20764C ENU TrainerHandbook
20764C
Administering a SQL Database
Infrastructure
MCT USE ONLY. STUDENT USE PROHIBITED
Information in this document, including URL and other Internet Web site references, is subject to change
without notice. Unless otherwise noted, the example companies, organizations, products, domain names,
e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with
any real company, organization, product, domain name, e-mail address, logo, person, place or event is
intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the
user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in
or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property
rights covering subject matter in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not give you any license to these
patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and
Microsoft makes no representations or warranties, either express, implied, or statutory, regarding
these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a
manufacturer or product does not imply endorsement by Microsoft of the manufacturer or product. Links
may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not
responsible for the contents of any linked site or any link contained in a linked site, or any changes or
updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission
received from any linked site. Microsoft is providing these links to you only as a convenience, and the
inclusion of any link does not imply endorsement by Microsoft of the site or the products contained
therein.
© 2018 Microsoft Corporation. All rights reserved.
Released: 02/2018
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE
These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which
includes the media on which you received it, if any. These license terms also apply to Trainer Content and any
updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms
apply.
BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS.
IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.
If you comply with these license terms, you have the rights below for each license you acquire.
1. DEFINITIONS.
a. “Authorized Learning Center” means a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, or such other entity as Microsoft may designate from time to time.
b. “Authorized Training Session” means the instructor-led training class using Microsoft Instructor-Led
Courseware conducted by a Trainer at or through an Authorized Learning Center.
c. “Classroom Device” means one (1) dedicated, secure computer that an Authorized Learning Center owns
or controls that is located at an Authorized Learning Center’s training facilities that meets or exceeds the
hardware level specified for the particular Microsoft Instructor-Led Courseware.
d. “End User” means an individual who is (i) duly enrolled in and attending an Authorized Training Session
or Private Training Session, (ii) an employee of a MPN Member, or (iii) a Microsoft full-time employee.
e. “Licensed Content” means the content accompanying this agreement which may include the Microsoft
Instructor-Led Courseware or Trainer Content.
f. “Microsoft Certified Trainer” or “MCT” means an individual who is (i) engaged to teach a training session
to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a
Microsoft Certified Trainer under the Microsoft Certification Program.
g. “Microsoft Instructor-Led Courseware” means the Microsoft-branded instructor-led training course that
educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led
Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.
h. “Microsoft IT Academy Program Member” means an active member of the Microsoft IT Academy
Program.
i. “Microsoft Learning Competency Member” means an active member of the Microsoft Partner Network
program in good standing that currently holds the Learning Competency status.
j. “MOC” means the “Official Microsoft Learning Product” instructor-led courseware known as Microsoft
Official Course that educates IT professionals and developers on Microsoft technologies.
k. “MPN Member” means an active Microsoft Partner Network program member in good standing.
l. “Personal Device” means one (1) personal computer, device, workstation or other digital electronic device
that you personally own or control that meets or exceeds the hardware level specified for the particular
Microsoft Instructor-Led Courseware.
m. “Private Training Session” means the instructor-led training classes provided by MPN Members for
corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware.
These classes are not advertised or promoted to the general public and class attendance is restricted to
individuals employed by or contracted by the corporate customer.
n. “Trainer” means (i) an academically accredited educator engaged by a Microsoft IT Academy Program
Member to teach an Authorized Training Session, and/or (ii) a MCT.
o. “Trainer Content” means the trainer version of the Microsoft Instructor-Led Courseware and additional
supplemental content designated solely for Trainers’ use to teach a training session using the Microsoft
Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer
preparation guide, train the trainer materials, Microsoft One Note packs, classroom setup guide and Pre-
release course feedback form. To clarify, Trainer Content does not include any software, virtual hard
disks or virtual machines.
2. USE RIGHTS. The Licensed Content is licensed, not sold. The Licensed Content is licensed on a one copy
per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed
Content.
2.1 Below are five separate sets of use rights. Only one set of rights applies to you.
2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not
separate its components and install them on different devices.
2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may
not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any
third parties without the express written permission of Microsoft.
2.4 Third Party Notices. The Licensed Content may include third party content that Microsoft, not the
third party, licenses to you under this agreement. Notices, if any, for the third party content are included
for your information only.
2.5 Additional Terms. Some Licensed Content may contain components with additional terms,
conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also
apply to your use of that respective component and supplement the terms described in this agreement.
a. Pre-Release Licensed Content. This Licensed Content subject matter is on the Pre-release version of
the Microsoft technology. The technology may not work the way a final version of the technology will
and we may change the technology for the final version. We also may not release a final version.
Licensed Content based on the final version of the technology may not contain the same information as
the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you
with any further content, including any Licensed Content based on the final version of the technology.
b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or
through its third party designee, you give to Microsoft without charge, the right to use, share and
commercialize your feedback in any way and for any purpose. You also give to third parties, without
charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback.
You will not give feedback that is subject to a license that requires Microsoft to license its technology,
technologies, or products to third parties because we include your feedback in them. These rights
survive this agreement.
c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on
the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the
Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the
technology that is the subject of the Licensed Content, whichever is earlier (“Pre-release term”).
Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies
of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some
rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more
rights despite this limitation, you may use the Licensed Content only as expressly permitted in this
agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only
allow you to use it in certain ways. Except as expressly permitted in this agreement, you may not:
• access or allow any individual to access the Licensed Content if they have not acquired a valid license
for the Licensed Content,
• alter, remove or obscure any copyright or other protective notices (including watermarks), branding
or identifications contained in the Licensed Content,
• modify or create a derivative work of any Licensed Content,
• publicly display, or make the Licensed Content available for others to access or use,
• copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or
distribute the Licensed Content to any third party,
• work around any technical limitations in the Licensed Content, or
• reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the
Licensed Content except and only to the extent that applicable law expressly permits, despite this
limitation.
5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to
you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws
and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the
Licensed Content.
6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations.
You must comply with all domestic and international export laws and regulations that apply to the Licensed
Content. These laws include restrictions on destinations, end users and end use. For additional information,
see www.microsoft.com/exporting.
7. SUPPORT SERVICES. Because the Licensed Content is “as is”, we may not provide support services for it.
8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail
to comply with the terms and conditions of this agreement. Upon termination of this agreement for any
reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in
your possession or under your control.
9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed
Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for
the contents of any third party sites, any links contained in third party sites, or any changes or updates to
third party sites. Microsoft is not responsible for webcasting or any other form of transmission received
from any third party sites. Microsoft is providing these links to third party sites to you only as a
convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party
site.
10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and
supplements are the entire agreement for the Licensed Content, updates and supplements.
12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws
of your country. You may also have rights with respect to the party from whom you acquired the Licensed
Content. This agreement does not change your rights under the laws of your country if the laws of your
country do not permit it to do so.
13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS
AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE
AFFILIATES GIVE NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY
HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT
CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND
ITS RESPECTIVE AFFILIATES EXCLUDE ANY IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP
TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL,
LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.
This limitation applies even if Microsoft knew or should have known about the possibility of the damages.
The above limitation or exclusion may not apply to you because your country may not allow the exclusion
or limitation of incidental, consequential or other damages.
Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this
agreement are provided below in French.
Remarque : Ce contenu sous licence étant distribué au Québec, Canada, certaines des clauses
de ce contrat sont fournies ci-dessous en français.
EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute
utilisation de ce contenu sous licence est à vos seuls risques et périls. Microsoft n’accorde aucune autre garantie
expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection des
consommateurs, que ce contrat ne peut modifier. Là où elles sont permises par le droit local, les garanties
implicites de qualité marchande, d’adéquation à un usage particulier et d’absence de contrefaçon sont exclues.
EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d’autres droits
prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre
pays si celles-ci ne le permettent pas.
Acknowledgements
Microsoft Learning would like to acknowledge and thank the following for their contribution towards
developing this title. Their effort at various stages in the development has ensured that you have a good
classroom experience.
Contents
Module 1: SQL Server Security
Module Overview 1-1
Lesson 1: Authenticating Connections to SQL Server 1-2
Module 10 Lab: Monitoring SQL Server with Alerts and Notifications L10-1
Course Description
This five-day instructor-led course provides students who administer and maintain SQL Server databases
with the knowledge and skills to administer a SQL Server database infrastructure. Additionally, it will be of
use to individuals who develop applications that deliver content from SQL Server databases.
Audience
The primary audience for this course is individuals who administer and maintain SQL Server databases.
These individuals perform database administration and maintenance as their primary area of
responsibility, or work in environments where databases play a key role in their primary job.
The secondary audiences for this course are individuals who develop applications that deliver content
from SQL Server databases.
Student Prerequisites
In addition to their professional experience, students who attend this training should already have the
following technical knowledge:
Basic knowledge of the Microsoft Windows operating system and its core functionality.
Course Objectives
After completing this course, students will be able to:
Course Outline
The course outline is as follows:
Module 1: ‘SQL Server Security’ introduces SQL Server security models, logins and users.
Module 2: ‘Assigning Server and Database Roles’ covers fixed server roles, user-defined server roles,
fixed database roles and user-defined database roles.
Module 3: ‘Authorizing Users to Access Resources’ describes permissions and the assignment of
permissions.
Module 4: ‘Protecting data with Encryption and Auditing’ describes SQL Server Audit.
Module 5: ‘Recovery models and Backup Strategies’ describes the concept of the transaction log and
SQL Server recovery models. It also introduces the different backup strategies available with SQL
Server.
Module 6: ‘Backup of SQL Server databases’ describes the SQL Server backup and the backup types.
Module 8: ‘Automating SQL Server management’ describes how to use SQL Server agent for
automation. It also explains the benefits of using master and target servers to centralize the
administration of automation.
Module 9: ‘Configuring Security for SQL Server agent’ describes the considerations for SQL Server
agent security, including proxy accounts and credentials.
Module 10: ‘Monitoring SQL Server with Alerts and Notifications’ covers the configuration of
database mail, alerts and notifications.
Module 11: ‘Introduction to managing SQL Server by using PowerShell’ introduces PowerShell for SQL
Server.
Module 12: ‘Tracing Access to SQL Server with Extended Events’ introduces how to use SQL Server
Profiler and SQL Trace stored procedures to capture information about SQL Server. The module also describes
how to use Distributed Replay to capture trace information from multiple servers and how to monitor
locking.
Module 13: ‘Monitoring SQL Server’ explains how to use Dynamic Management Views to monitor
SQL Server. It also describes configuration of data collection and the SQL Server utility.
Module 14: ‘Troubleshooting SQL server’ explains the SQL Server troubleshooting methodology and
discusses the most common issues that arise when working with SQL Server systems.
Module 15: ‘Importing and exporting data’ covers the use of import/export wizards and explains how
they relate to SSIS. The module also introduces BCP and BULK INSERT.
Course Materials
The following materials are included with your kit:
Course Handbook: a succinct classroom learning guide that provides the critical technical
information in a crisp, tightly-focused format, which is essential for an effective in-class learning
experience.
o Lessons: guide you through the learning objectives and provide the key points that are critical to
the success of the in-class learning experience.
o Labs: provide a real-world, hands-on platform for you to apply the knowledge and skills learned
in the module.
o Module Reviews and Takeaways: provide on-the-job reference material to boost knowledge
and skills retention.
Modules: include companion content, such as questions and answers, detailed demo steps and
additional reading links, for each lesson. Additionally, they include Lab Review questions and answers
and Module Reviews and Takeaways sections, which contain the review questions and answers, best
practices, common issues and troubleshooting tips with answers, and real-world issues and scenarios
with answers.
Resources: include well-categorized additional resources that give you immediate access to the most
current premium content on TechNet, MSDN®, or Microsoft® Press®.
Course evaluation: at the end of the course, you will have the opportunity to complete an online
evaluation to provide feedback on the course, training facility, and instructor.
Note: At the end of each lab, you must revert the virtual machines to a snapshot. You can
find the instructions for this procedure at the end of each lab.
The following table shows the role of each virtual machine that is used in this course:
Software Configuration
The following software is installed on the virtual machines:
SQL2017
SharePoint 2017
Course Files
The files associated with the labs in this course are located in the D:\Labfiles folder on the 20764C-MIA-
SQL virtual machine.
Classroom Setup
Each classroom computer will have the same virtual machine configured in the same way.
DVD drive
Network adapter
Additionally, the instructor’s computer must be connected to a projection display device that supports
SVGA 1024×768 pixels, 16-bit colors.
Module 1
SQL Server Security
Contents:
Module Overview 1-1
Lesson 1: Authenticating Connections to SQL Server 1-2
Module Overview
Protection of data within your Microsoft® SQL Server® databases is essential and requires a working
knowledge of the issues and SQL Server security features.
This module describes SQL Server security models, logins, users, partially contained databases, and cross-
server authorization.
Objectives
After completing this module, you will understand, and have a working knowledge of:
Lesson 1
Authenticating Connections to SQL Server
Security implementation for SQL Server usually begins at the server level, where users are authenticated
based on logins and organized into server-level roles to make permissions easier to manage.
Lesson Objectives
After completing this lesson, you will be able to:
Understand how the Azure® SQL Database firewall operates and is configured.
Manage logins and policies in SQL Server.
Security Hierarchies
Security architectures are often hierarchical, primarily to simplify management of permissions. In a
hierarchical security architecture, securables can contain other securables; for example, a folder can
contain files, and principals can contain other principals—where users are added to a group, for example.
Permissions are usually inherited, both by hierarchies of securables (for example, granting “read”
permission on a folder implicitly grants “read” permission on the files it contains), and by hierarchies of
principals (for example, granting “read” permission to a group implicitly grants read permission to all
users who are members of that group). Generally, inherited permissions can be explicitly overridden at
different hierarchy levels, to fine-tune access.
Fewer individual permissions need to be granted, reducing the risk of misconfiguration. You can set
the general permissions that are required at the highest level in the hierarchy, and only apply explicit
overriding permissions further down the hierarchy, to handle exceptional cases.
After the permissions have been set, they can be controlled through group membership. This makes it
easier to manage permissions in environments where new users arrive and existing users leave or
change roles.
Best Practice: When planning a security solution, consider the following best practices:
Provide each principal with only the permissions they actually need.
Use securable inheritance to minimize the number of implicit permissions that must be set to enable
the required level of access.
Use principal containers, such as groups or roles, to create a layer of abstraction between principals
and permissions to access securables. You can then use membership of these groups to control access
to resources via the permissions you have defined. Changes in personnel should not require changes
to permissions.
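As a sketch of these practices in Transact-SQL (the role, schema, object, and user names here are hypothetical, not drawn from the course labs):

```sql
-- Create a principal container (a database role) instead of granting to individuals
CREATE ROLE SalesReaders;

-- Grant at the securable-container level; SELECT on the schema is inherited
-- by every table and view the schema contains
GRANT SELECT ON SCHEMA::Sales TO SalesReaders;

-- Explicitly override the inherited permission lower in the hierarchy
-- to handle an exceptional case
DENY SELECT ON OBJECT::Sales.CreditCard TO SalesReaders;

-- Control access through membership; personnel changes then require
-- no changes to permissions
ALTER ROLE SalesReaders ADD MEMBER SalesUser1;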
Note: Azure Active Directory®. Azure Active Directory (Azure AD) enables you to
manage user authentication (and access to other Microsoft services) in a single, central location.
Windows Authentication
With Windows authentication, SQL Server does not require a separate user name and password; Windows
validates the user's credentials, and access is granted based on an access token issued when the user
logged on to the Windows session. This is the default authentication mode, and Windows user security
controls access to SQL Server.
Note: Kerberos security protocol is used to provide security policies such as account
locking, strong passwords, and password expiration. The Kerberos protocol is supported by SQL
Server over the TCP/IP, named pipes, and shared memory communication protocols.
Because groups of users can be created in Windows, domain access administration is simplified.
SQL Server authentication requires a login when the application is started. The user name and password
are stored within database tables, and so are separate from the Windows authentication. The following
optional policies can be applied:
Change Password at Next Login. This feature is enabled in SQL Server Management Studio.
Password Expiration. You can set a maximum age policy for passwords.
Windows Password Policy. You can enforce Windows policies for passwords, such as minimum
and maximum password lengths, and other rules that define minimum standards for passwords,
such as requiring the user to include capitals, numbers, and punctuation marks in their passwords.
Note:
o If you install SQL Server using mixed mode authentication, setup enables a SQL Server login
called sa. It is important to create a complex password for this login because it has administrative
rights at database server level.
o If you install using Windows authentication mode, then change to mixed authentication later, this
does not enable the sa login.
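In that situation, you can enable sa yourself after switching to mixed mode authentication. A minimal sketch (the password shown is only a placeholder; always substitute a complex one of your own):

```sql
-- sa remains disabled after a later switch to mixed mode; enable it explicitly
ALTER LOGIN sa ENABLE;

-- Assign a complex password; this value is a placeholder, not a recommendation
ALTER LOGIN sa WITH PASSWORD = 'Repl@ce-Th1s-Placeh0lder!';
```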
Use Azure Active Directory Authentication for authentication with SQL Database or SQL Data
Warehouse
http://aka.ms/Wo9lg9
Note: Encrypted authentication only applies to clients running the SQL Server 2005 version
of SNAC or later. If an earlier client that does not understand encrypted authentication tries to
connect, by default SQL Server will allow the connection, but the username and password will be
transmitted in plain text. If this is a concern, you can use SQL Server Configuration Manager to
disallow unencrypted authentication from down-level clients by setting Force Encryption
to Yes.
To find information about SQL Server Authentication modes, see Choose an Authentication Mode in
Microsoft Docs:
Choose an Authentication Mode
http://aka.ms/Os6j0m
Firewall Rules
There are two levels at which you configure Azure Firewall Rules:
Server. You can configure rules that are stored on the master database to allow principals access to
your server.
Database. You can configure rules to allow principals access to individual databases. These rules are
stored on the specific databases. For example, you might want certain principals to be denied access
to a financial database but allow access to a marketing database.
Best Practice: Microsoft recommends using database-level firewall rules. You should
consider using server rules if you have many databases with the same access requirements.
Note: Before proceeding, you need to ask your systems administrators for your outbound
IP address range.
The initial state for your Azure Database Firewall is to block all access. To enable access, you should go to
the Azure Portal and create one or more server-level and database-level rules containing the IP address
range you are using. You should also specify whether Azure applications can connect to your Azure SQL
Database server.
To allow server access, you create a server-level rule containing the IP address range.
To allow access to a specific database, you create a database-level rule configured as follows:
o Make sure that the IP address the principal is using is outside of the range that is specified in the
server-level rule.
o Ensure the IP address of the principal falls within the range specified by the database-level rule.
For more information on how to configure Azure firewall settings, see the Manage firewall rules using the
Azure portal section of Azure SQL Database server-level and database-level firewall rules in the Microsoft
Azure documentation:
Azure SQL Database server-level and database-level firewall rules
http://aka.ms/iddzy1
1. Server Level. If the client IP address falls within an IP range linked to a server-level rule, the
connection is granted access to all databases on the server.
2. Database Level. If the client IP address is not within a server-level range, then the database-level
rules are checked. If the IP address falls within an IP range linked to one of the database-level rules,
then access is granted to all databases that the rule has been assigned to.
3. Connection Denied. If the client IP address falls outside of both server-level and database-level rules,
the connection will be denied.
Note: Local Device. To access Azure SQL Database, the firewall on your local device
must allow outgoing communications on TCP port 1433.
Transact-SQL
You can execute a query from SQL Server Management Studio or the Classic Azure Portal. You need to
connect to the master database, and then you can select, create, update or delete firewall rules.
For more information about querying Azure firewall settings using Transact-SQL, see the Manage firewall
rules using Transact-SQL section of Azure SQL Database server-level and database-level firewall rules in the
Microsoft Azure documentation:
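As a sketch of the system objects involved (the rule names and IP ranges below are illustrative only), you run the server-level statements in the master database of the logical server, and the database-level statements in the specific user database:

```sql
-- In master: list and manage server-level firewall rules
SELECT * FROM sys.firewall_rules;

EXECUTE sp_set_firewall_rule
    @name = N'HeadOffice',               -- illustrative rule name
    @start_ip_address = '203.0.113.1',   -- illustrative range
    @end_ip_address = '203.0.113.20';

EXECUTE sp_delete_firewall_rule @name = N'HeadOffice';

-- In a user database: list and manage database-level firewall rules
SELECT * FROM sys.database_firewall_rules;

EXECUTE sp_set_database_firewall_rule
    @name = N'MarketingUsers',           -- illustrative rule name
    @start_ip_address = '203.0.113.1',
    @end_ip_address = '203.0.113.20';
```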
Azure PowerShell™
To manage your firewall rules using Azure PowerShell, see the Manage firewall rules using Azure
PowerShell section of Azure SQL Database server-level and database-level firewall rules in the Microsoft
Azure documentation:
REST API
To manage your firewall rules using REST API, see the Manage firewall rules using REST API section of
Azure SQL Database server-level and database-level firewall rules in the Microsoft Azure documentation:
Local Firewall Rules. TCP Port 1433 must be allowed. If you are inside the Azure cloud boundary,
you might have to create rules for additional ports.
Network Address Translation (NAT). Your local device might use a different IP address from the one
you use to connect to Azure SQL Database. You can go to the portal and configure the Azure
Database Firewall to use the Current Client IP Address.
Dynamic IP Addresses. In some cases, you might have to ask your Internet Service Provider (ISP) for
the range they use for your device. You can also obtain a static address.
For more detailed information about troubleshooting your Azure Database Firewall, see the
Troubleshooting the database firewall section of Azure SQL Database server-level and database-level
firewall rules in the Microsoft Azure documentation:
For more information about ports beyond 1433 for ADO.NET 4.5, see Ports beyond 1433 for ADO.NET 4.5
in the Microsoft Azure documentation:
Creating Logins
To create a login by using SSMS, expand the
Security node for the relevant server instance,
right-click Logins, and then click New Login.
Complete the details in the Login - New dialog box
to configure the login that you require.
Alternatively, you can create logins by using the
CREATE LOGIN Transact-SQL statement.
In this example, a login named ADVENTUREWORKS\SalesReps is created for the Windows group of the
same name. The default database for the user will be salesdb. If you do not specify this option, the
default database is set to master.
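The example described here was not reproduced above; a minimal reconstruction, based on the description, might look like this:

```sql
-- Create a login for the ADVENTUREWORKS\SalesReps Windows group,
-- setting salesdb as the default database
CREATE LOGIN [ADVENTUREWORKS\SalesReps]
FROM WINDOWS
WITH DEFAULT_DATABASE = salesdb;
```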
Note: Windows user and group names must be enclosed in square brackets because they
contain a backslash character.
You create SQL Server logins in the same way. There are, however, additional arguments that are only
relevant to SQL Server logins (for example, the PASSWORD argument).
The following example shows how to create a login named DanDrayton and assign a password of
Pa55w.rd:
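A sketch of the statement described above:

```sql
-- Create a SQL Server login with a password
CREATE LOGIN DanDrayton
WITH PASSWORD = 'Pa55w.rd';
```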
When you create a SQL Server login, you can specify the following options to control how the password
policy is enforced:
MUST_CHANGE: SQL Server will prompt the user to change their password the next time they log
on. You must ensure that whatever client application the user will use to connect to SQL Server
supports this. The default value for this setting is ON when using the user interface to create a login,
but OFF when using a Transact-SQL CREATE LOGIN statement.
CHECK_POLICY = {ON | OFF}: Setting this value to ON enforces the password complexity policy for
this user. The default value for this setting is ON.
CHECK_EXPIRATION = {ON | OFF}: Setting this value to ON enables password expiration, forcing
the user to change their password at regular intervals. The default value for this setting is ON when
using the user interface to create a login, but OFF when using a Transact-SQL CREATE LOGIN
statement.
You can configure policy settings for a SQL Server login in SSMS, or in the CREATE LOGIN or ALTER LOGIN
statement. The following code example modifies the DanDrayton login created earlier to explicitly disable
policy checking:
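The modification described above might look like the following sketch; note that password expiration cannot remain enabled while policy checking is disabled, so both options are turned off:

```sql
-- Disable password expiration and policy checking for the login
ALTER LOGIN DanDrayton
WITH CHECK_EXPIRATION = OFF, CHECK_POLICY = OFF;
```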
The full application of account policy is not always desirable. For example, some applications use fixed
credentials to connect to the server. Often, these applications do not support the regular changing of
login passwords. In these cases, it is common to disable password expiration for those logins.
You can reset passwords by using SSMS or the ALTER LOGIN Transact-SQL statement.
Changing a Password
ALTER LOGIN DanDrayton
WITH OLD_PASSWORD = 'Pa55w.rd',
PASSWORD = 'NewPa55w.rd';
The following code shows how to use the ALTER LOGIN statement to disable a login:
Disabling a Login
ALTER LOGIN DanDrayton DISABLE;
You can remove logins from a server by using the DROP LOGIN statement or SSMS. If a user is currently
logged in, you cannot drop their login without first ending their session.
Dropping a Login
DROP LOGIN DanDrayton;
Create logins.
Manage server-level roles.
Demonstration Steps
Set the Authentication Mode
1. Ensure that the 20764C-MIA-DC and 20764C-MIA-SQL virtual machines are running, and log on to
20764C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.
3. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.
4. Start SQL Server Management Studio, and connect to the MIA-SQL database engine using Windows
authentication.
6. In the Server Properties - MIA-SQL dialog box, on the Security page, verify that SQL Server and
Windows Authentication mode is selected, and then click Cancel.
Create Logins
1. In Object Explorer, expand Security, and expand Logins to view the logins that are currently defined
on this server instance.
4. In the Select User, Service Account, or Group dialog box, click Object Types.
5. In the Object Types dialog box, ensure only Users and Groups are selected, and then click OK.
6. In the Select User, Service Account, or Group dialog box, click Locations.
7. In the Locations dialog box, expand Entire Directory, click adventureworks.msft, and then click
OK.
8. In the Select User, Service Account, or Group dialog box, click Advanced.
9. In the Select User, Service Account, or Group dialog box, click Find Now. This produces a list of all
users and groups in the Active Directory domain.
10. In the list of domain objects, click HumanResources_Users (this is a domain local group that
contains multiple global groups, each of which in turn contains users), and then click OK.
11. In the Select User, Service Account, or Group dialog box, ensure that HumanResources_Users is
listed, and then click OK.
12. In the Login - New dialog box, in the Default database list, click AdventureWorks, and then click
OK.
15. In the Login - New dialog box, in the Login name box, type Payroll_Application, and then click
SQL Server authentication.
16. Enter and confirm the password Pa55w.rd, and then clear the Enforce password expiration check
box (which automatically clears the User must change password at next login check box).
17. In the Default database list, click AdventureWorks, and then click OK.
18. In Object Explorer, in the Logins folder, verify that the Payroll_Application login has been added.
19. Open the CreateLogins.sql script file in the D:\Demofiles\Mod01 folder and review the code it
contains. This creates a Windows login for the ADVENTUREWORKS\AnthonyFrizzell user and the
ADVENTUREWORKS\Database_Managers local group, and a SQL Server login named
Web_Application.
20. Click Execute, and when the script has completed successfully, in Object Explorer, refresh the Logins
folder and verify that the logins have been created.
Lesson 2
Authorizing Logins to Connect to Databases
After creating a login, you must grant it access to at least one database before it can be used to work
with data. Generally, you should only enable logins to access the databases that they need to work
with. You do this by creating a database user for the login in each database that it must access.
In this lesson, you will see how to create and manage database-level authorization, including database
users and database roles.
Lesson Objectives
After completing this lesson, you will be able to:
Configure principals (for example, users) with passwords so that they can access the relevant
databases.
You can also create database users by using the CREATE USER statement.
-- Example 1
CREATE USER SalesReps
FOR LOGIN [ADVENTUREWORKS\SalesReps]
WITH DEFAULT_SCHEMA = Sales;
-- Example 2
CREATE USER DanDrayton
FOR LOGIN DanDrayton;
-- Example 3
CREATE USER WebUser
FOR LOGIN [ADVENTUREWORKS\WebAppSvcAcct];
Note: The names of Windows logins must be enclosed in square brackets “[…]” because
they contain a backslash “\” character.
Schemas are namespaces that are used to organize objects in the database. If no default schema is
specified when a user is created, dbo is used.
Example 1. The user is created for a Windows group login, with Sales specified as the default schema.
Example 2. No default schema is specified, so the user's default schema will be dbo.
Example 3. No default schema is specified, so the default schema will be dbo. Also note that the user
name is different from the name of the login with which it is associated.
You can remove users from a database by using the DROP USER statement or SSMS. However, you cannot
drop a database user who owns any securable object (for example, tables or views).
This solves the issue but, if you later restore the database on the same or a different server, the problem
will reoccur. A better way of avoiding the problem is by using the WITH SID clause when you create the
login.
Note: Managing mismatched SIDs, orphaned users (those whose logins are disconnected
when a database is moved to another SQL Server instance), impersonation and delegation are
discussed in a later lesson, Authorization Across Servers.
dbo User
The dbo user is a special user who has permissions
to perform all activities in the database. Any
member of the sysadmin fixed server role (including
the sa user when using mixed mode authentication)
who uses a database is mapped to the special
database user called dbo. You cannot delete the dbo database user and it is always present in every
database.
Database Ownership
Like other securable objects in SQL Server, databases also have owners—these are mapped to the dbo
user.
The following example shows how you can modify the owner of a database by using the ALTER
AUTHORIZATION statement:
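The statement referred to above was not reproduced; a sketch, using hypothetical database and login names, might be:

```sql
-- Make the ADVENTUREWORKS\DatabaseManager login the owner of SalesDB
-- (hypothetical names)
ALTER AUTHORIZATION ON DATABASE::SalesDB
TO [ADVENTUREWORKS\DatabaseManager];
```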
Any schema created by a login mapped to the dbo user will automatically have dbo as its owner. By
default, objects within a schema have their owner set to NULL and inherit the owner of the schema in
which they are defined. Owners of objects have full access to the objects and do not require explicit
permissions before they can perform operations on those objects.
guest User
The guest user account enables logins that are not mapped to a database user in a particular database to
gain access to that database. Login accounts assume the identity of the guest user when the following
conditions are met:
The login has access to the SQL Server instance, but does not have access to the database through its
own database user mapping.
The guest user account is enabled in the database.
You can enable the guest account in a database to give anyone with a valid SQL Server login access to it.
The guest username is automatically a member of the public role. (Roles will be discussed in a later
module.)
SQL Server checks to see whether the login that is trying to access the database is mapped to a
database user in that database. If it is, SQL Server grants the login access to the database as that
database user.
If the login is not mapped to a database user, SQL Server then checks to see whether the guest
database user is enabled. If it is, the login is granted access to the database as guest. If the guest
account is not enabled, SQL Server denies access to the database for that login.
You cannot drop the guest user from a database, but you can prevent it from accessing the database by
using the REVOKE CONNECT statement. Conversely, you can enable the guest account by using the
GRANT CONNECT statement.
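For example, run in the context of the relevant database:

```sql
-- Prevent the guest user from accessing the current database
REVOKE CONNECT FROM guest;

-- Re-enable guest access to the current database
GRANT CONNECT TO guest;
```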
Note: By default, the guest user is enabled in the master, msdb, and tempdb databases.
You should not try to revoke the guest access to these databases.
Briefly, instead of SQL Server directly authorizing a principal such as a client application or user, the
principal is passed to an STS—such as WIF.
For example, for a Windows user attempting to connect to a SQL Server database:
2. WIF will build a security token based on the records it holds about the user.
3. The principal (or login service) then presents the token to SQL Server to verify that the principal is
authorized for access and, if correct, will create a session for the principal.
4. Once the connection is made, then SQL Server will create a list of all security tokens for each
Windows group the user belongs to. This is a recursive procedure so that nested Windows groups are
identified.
5. If the user context changes to another database, a further list of security tokens is made. In other
words, SQL Server and WIF will make sure the user has access to the new database. If the user has
permission to access the new database, then the security token is added to the token list. As in the
previous step, the collection and building of the token list is repeated recursively.
6. The two lists contain security tokens representing all of the principals for the user, and each contains
a record of deny or grant permissions. This is the Active Permission Set.
For more information about WIF, see Windows Identity Foundation in the Microsoft .NET documentation:
Login tokens are created to authorize login credentials. The following code returns a row for each server
principal that is part of the login token:
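The query referred to above is presumably of the following form:

```sql
-- Return the security tokens that make up the current login token
SELECT * FROM sys.login_token;
```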
For more information about sys.login_token, see sys.login_token (Transact-SQL) in Microsoft Docs:
sys.login_token (Transact-SQL)
http://aka.ms/Qmwvd2
User tokens are created to authorize user credentials. The following code returns a row for each server
principal that is part of the user token:
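The query referred to above is presumably of the following form:

```sql
-- Return the security tokens that make up the current user token
SELECT * FROM sys.user_token;
```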
For more information about sys.user_token, see sys.user_token (Transact-SQL) in Microsoft Docs:
sys.user_token (Transact-SQL)
http://aka.ms/Lx2e6w
Create a login
Alter server roles
Create a user
Demonstration Steps
1. In SQL Server Management Studio, open the Security Tokens Demo.sql script file in the
D:\Demofiles\Mod01 folder. Note that in some cases, to view the results in SQL Server Management
Studio, right-click the containing node in Object Explorer, and then click Refresh to update the
objects.
4. In Object Explorer, under MIA-SQL under Security, expand Server Roles to view the new server
roles.
6. In Object Explorer, under MIA-SQL under Security, under Logins, view the new login.
9. In Object Explorer, under MIA-SQL expand Databases, expand AdventureWorks, expand Security,
expand Roles, and then expand Database Roles to view the new database roles.
10. Select the code under Step F, and then click Execute.
11. In Object Explorer, under MIA-SQL under Databases, under AdventureWorks, under Security,
expand Users to view the new user.
12. Select the code under Step G, and then click Execute.
13. Select the code under Step H, and then click Execute to view the current user and login tokens. The
code is executed in the security context of the login created for this demonstration. Notice that
tokens relating to the database user, both the MyExtDatabaseRole and MyDatabaseRole database
roles, and the public database role, are linked to the user.
14. Select the code under Step I, and then click Execute to remove all changes.
16. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
Authorization Across Servers
This lesson will describe multiserver authorization.
Lesson Objectives
After completing this lesson, you will be able to:
External Data Access. Enables access to data that is external to the SQL Server instance.
Diverse External Data Sources. You can work with many types of data sources within your
organization.
Standardized Method. The way you interact with other data sources is standardized.
Data Source. The object that contains the data you want to work with. This is usually a database, but
it can also be another type of data source, such as a spreadsheet.
Note:
o SQL Server has been tested against the Native Client OLE DB provider and other providers.
The Open Database Connectivity (ODBC) provider can enable secure connections to many other
data sources.
1. Client Tier. The application that requires the data. This is where you will create the distributed
queries and present them to your SQL Server instance.
2. Server Tier. Your SQL Server instance directs the query to the relevant database via the OLE DB
provider DLLs—for example, a SQL Server database, Oracle and Microsoft Access. ODBC enables
connections to many other data sources, including Microsoft Excel.
3. Database Server Tier. These are the actual data sources. Note that you can use multiple SQL Server
databases for your distributed query.
Note: For third-party OLE DB providers, the SQL Server service account must have read and
execute permissions for all folders and subfolders in which the provider .dll is located.
You can use SQL Server Management Studio to add, change and remove linked server connection details:
Create. To set up a linked server, you use Object Explorer. Open the SQL Server instance, right-click
Server Objects, select New, then select Linked Server. You can then enter details in the New Linked
Server dialog.
View or Amend. Select the linked server, right-click and select Properties. The Linked Server
Properties dialog is displayed for you to change or view details.
Delete. To delete a linked server, you right-click the server and click Delete.
Stored procedures and catalog views can also be used to manage linked server connections:
sp_addlinkedserver. Creates a linked server definition so that your instance can access data on the
remote server.
sys.servers. You can query this system catalog view to return linked server details.
sp_dropserver. Deletes a linked server definition and can be used to remove references to remote
servers. Note that you can also drop remote logins using this stored procedure, as shown in the
following example.
sp_addlinkedsrvlogin and sp_droplinkedsrvlogin. These stored procedures allow you to map and
remove logins between your SQL Server instance and other server security accounts. See below for
further details.
The following code example shows how the sp_addlinkedserver and sp_dropserver can be used to
create and drop a linked server for RemoteServer:
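The example was not reproduced above; a sketch, assuming RemoteServer is another SQL Server instance, might be:

```sql
-- Add a linked server definition for RemoteServer
EXEC sp_addlinkedserver
    @server = N'RemoteServer',
    @srvproduct = N'',
    @provider = N'SQLNCLI',
    @datasrc = N'RemoteServer';

-- Drop the linked server and any associated remote logins
EXEC sp_dropserver
    @server = N'RemoteServer',
    @droplogins = 'droplogins';
```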
Note: Fully Qualified Naming. The following four-part name format is required in a
distributed query when referring to a linked server:
linked_server_name.catalog.schema.object_name.
For more information about linked servers, see Linked Servers (Database Engine) in Microsoft Docs:
When you run a stored procedure that queries a remote server or a distributed query linked server, there
must be a mapping between the requesting instance and the data sources.
Note: Windows Authentication Mode should be used if possible. However, some data
sources might not allow Windows Authentication Mode—for these, you can set up a Windows
authenticated login locally and map this to a specific login on the data source.
The following code example shows how you can map and drop login mappings to RemoteServer for the
LocalUserName and RemoteUserName:
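A sketch of the mapping statements described above:

```sql
-- Map the local login LocalUserName to RemoteUserName on RemoteServer
-- (the remote password shown is illustrative)
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'RemoteServer',
    @useself = 'FALSE',
    @locallogin = N'LocalUserName',
    @rmtuser = N'RemoteUserName',
    @rmtpassword = 'Pa55w.rd';

-- Remove the mapping
EXEC sp_droplinkedsrvlogin
    @rmtsrvname = N'RemoteServer',
    @locallogin = N'LocalUserName';
```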
You can test linked server connections in SQL Server Management Studio by right-clicking on the linked
server and clicking Test Connection in the Properties dialog box.
This is the “double-hop” problem. To solve it, SQL2 must be made aware that USER1 has a Windows-
authenticated identity that allows the query to execute. Delegation is required so that SQL1 forwards the
credentials of USER1 to SQL2; delegation allows SQL1 to impersonate the user so that SQL2 will run the
distributed query.
Delegation Requirements
Client. The following list summarizes the client (USER1) requirements:
o Access Permissions. USER1 must have Windows Authenticated logins for SQL1 and SQL2.
o Active Directory. The Account is sensitive and cannot be delegated property must not be selected
for the USER1 account in Active Directory.
o Delegation. The active SQL Server account must be trusted for delegation. See below for more
details.
For more information about double-hop, see Configuring Linked Servers for Delegation in Microsoft
TechNet:
Delegation
You have already seen how delegation works in the
previous topic, Typical “Double-Hop” Problem. In this
case, a SQL Server request used a forwarded identity
to execute on a remote server.
Impersonation
Impersonation is carried out on the remote server in a similar way as if the request were executed on the
local server.
Windows Authentication. This uses a Windows identity token using Kerberos or the Security
Support Provider Interface. The supplied identity token can then be cached on the service for use in
impersonation.
Service-For-User (S4U). Impersonation is carried out using a Windows identity token obtained from
Kerberos extensions. S4U is used when clients use non-Windows authentication.
Note: The process account must have the Act as part of the operating system access
right.
LogonUser API. An identity token can be requested and obtained from the LogonUser API. You can
use this Windows API to access remote services but delegation trust is not used.
EXECUTE AS Example
EXECUTE AS LOGIN = 'DanDrayton';
SELECT SUSER_SNAME(), USER_NAME();
REVERT;
For more information about using EXECUTE AS, see EXECUTE AS (Transact-SQL) in Microsoft Docs:
EXECUTE AS (Transact-SQL)
http://aka.ms/Kho6sq
Orphaned Users
USE <MyMovedDatabase>;
GO
SELECT dp.type_desc, dp.SID, dp.name AS user_name
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
ON dp.SID = sp.SID
WHERE sp.SID IS NULL
AND authentication_type_desc = 'INSTANCE';
2. Resolve Orphaned Users. If any orphaned users are returned, you need to create logins for them
with Transact-SQL as in the following code example:
To resolve orphaned users, you use the CREATE LOGIN statement with the SID for the orphaned user:
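The code example was not reproduced above; a sketch, using a login name from the earlier demonstration and an illustrative SID value, might be:

```sql
-- Recreate the login using the SID returned by the orphaned users report
-- (the SID value shown is illustrative)
CREATE LOGIN reportuser
WITH PASSWORD = 'Pa55w.rd',
     SID = 0x241C11948AEEB749B0D22646DB1A19F2;
```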
3. Database Owner (dbo). You can use Transact-SQL to relink an orphaned dbo:
To change dbo to MyUser in the destination database, you use the ALTER AUTHORIZATION statement.
Change dbo
ALTER AUTHORIZATION ON DATABASE::MyDatabase TO MyUser;
GO
A SQL Server login that was sourced from a Windows account can access a database if the account is part
of a Windows group and is also a database user.
Demonstration Steps
1. In SQL Server Management Studio, open the MismatchedIDs.sql script file in the
D:\Demofiles\Mod01 folder.
2. Select the code under Step A, and then click Execute to run the orphaned users report in the TSQL
database. Two users should be returned. Note the SID for the appuser1 user.
3. Select the code under Step B, and then click Execute to demonstrate that an appuser login exists,
but has a different SID to that referenced by the appuser1 user.
4. Select the code under Step C, and then click Execute to repair the appuser1 user by linking it to the
appuser login.
5. Select the code under Step D, and then click Execute to demonstrate that appuser1 is no longer an
orphaned user. Note the SID for the reportuser1 user.
6. Select the code under Step E, and then click Execute to create the reportuser login with a defined
SID value. The SID matches the SID returned in the orphaned users report for reportuser1.
7. Select the code under Step F, and then click Execute to demonstrate that no orphaned users remain.
Lesson 4
Partially Contained Databases
The containment feature in SQL Server reduces the reliance a database has on the SQL Server instance
that hosts it. This is very useful for optimizing the moving of a database to another SQL Server instance.
This lesson discusses the containment feature, how it is deployed, and the considerations for using
partially contained databases.
Lesson Objectives
After completing this lesson, you will be able to:
When a database is in development and the developer does not know which instance will ultimately
host the database.
When a database that participates in an Always On availability group is hosted on multiple server
instances, and it is useful to be able to fail over to a secondary instance without having to synchronize
the server-level logins required to access the database.
A contained database is one that is hosted on an instance of SQL Server, but which has no dependencies
on the server instance. Because there are no dependencies, you can move the database between servers,
or use it in availability group scenarios, without having to consider external factors, such as logins.
Containment Levels
Noncontained databases rely on their SQL Server instance for many features, such as metadata,
authentication, collations, and resources; most SQL Server databases are noncontained. Partially
contained databases have fewer dependencies on the hosting SQL Server instance and are mostly
managed separately from the database engine instance, although they can still use some uncontained
resources that are managed by the SQL Server instance outside the database boundary.
Note: There is one further level of database containment: full containment. Fully contained
databases hold all the metadata, data objects, data, logins, user accounts, and other resources they
need to operate (resources that would otherwise be held at the SQL Server instance level). These
databases are fully stand-alone. Full containment is not supported in SQL Server 2017.
Term Description
Database Boundary The boundary between a partially contained database and other
databases, and the SQL Server instance.
Contained Attribute Refers to an element that is held within the database boundary.
Uncontained Attribute An element that is not held within the database boundary.
Contained User This type of user is authenticated by the partially contained database.
Windows Principals The partially contained database will use Windows to authenticate
these users—they are trusted and do not require logins to be set up in
the master database.
The database contains users and can authenticate those users without reference to SQL Server logins.
Authentication can be performed by the database itself, or by trusting users that have been
authenticated by Windows.
In SSMS, you toggle containment using the Containment Type drop-down in the Options page of the
Database Properties dialog box.
The following code shows how you can create a new partially contained database by using the CREATE
DATABASE Transact-SQL statement:
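A sketch, using a hypothetical database name; contained database authentication must also be enabled at the instance level:

```sql
-- Enable contained database authentication for the instance (if not already on)
EXEC sp_configure 'contained database authentication', 1;
RECONFIGURE;
GO

-- Create a partially contained database (hypothetical name)
CREATE DATABASE MyContainedDB
CONTAINMENT = PARTIAL;
```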
You can also change the containment level of an existing database; you can use the CONTAINMENT
parameter of the ALTER DATABASE Transact-SQL statement to enable or disable containment.
The following code shows how to enable or disable containment in an existing database:
Alter Containment
-- Convert a database to partial containment
ALTER DATABASE MyDatabase SET CONTAINMENT = PARTIAL;
Contained Users
After creating a contained database, you can create contained users for that database. These users can be
one of two types:
Users with associated password. These users are authenticated by the database; the user’s password is
stored in the database metadata.
Users that are mapped to Windows user accounts. These users exist only in the database, with no
associated server-level login, and do not require the user to maintain a separate password. Instead of
performing its own authentication, the database trusts Windows authentication.
You create contained users by using the CREATE USER statement in the context of a contained database.
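For example, using hypothetical names and running in the context of the contained database:

```sql
USE MyContainedDB;
GO

-- A contained user authenticated by the database itself
CREATE USER AppUser WITH PASSWORD = 'Pa55w.rd';

-- A contained user mapped to a Windows account, with no server-level login
CREATE USER [ADVENTUREWORKS\DanDrayton];
```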
For more information about contained databases, see Contained Databases in Microsoft Docs:
Contained Databases
http://aka.ms/F30zfd
Note: To facilitate moving contained databases, you need to document the external
elements that are required for the database to function outside of the current SQL Server
instance. For example, a related database might need to be noted, in addition to the relationship
details. Similarly, external settings need to be identified and documented.
Administration
To manage database settings for a noncontained database, an administrator will need sysadmin privilege
to change instance level settings. In a partially contained database, the administrator will have more
control over their own database. Administrators might still require access to noncontained elements.
Database Development
Development of new databases will take place away from the live environment. By using partially
contained databases, the developer can evaluate the effect that instance level elements might have on the
database.
The following considerations need to be made when designing, developing, or using partially contained
databases:
1. Cross Database Queries. Users with the same name and password on different partially contained
databases are not the same user.
3. Change Data Capture (CDC). CDC is not supported in partially contained databases.
5. Password Policy. CREATE USER does not support bypassing the password policy. If the partially
contained database does not have a password policy, this can cause problems after migration.
6. Synonyms. Some external dependencies might be omitted and cause issues when a partially
contained database is moved.
7. Connection Strings. You must specify the database name in application connection strings.
11. Collation. Partially contained databases do not rely on tempdb in the SQL Server instance to hold
collation details. Therefore, code that uses explicit collation might need to be altered to use
CATALOG_DEFAULT instead of DATABASE_DEFAULT. Schema-bound objects that depend on built-in
functions whose behavior changes with collation might also be affected.
12. IntelliSense. An SSMS connection to a partially contained database with a contained user login does
not wholly support this feature.
13. SQL Server Data Tools (SSDT). SSDT is not aware of containment and will not inform you if
containment is not adhered to.
For further information about contained databases, see Contained Database Users – Making Your
Database Portable, Contained Database Collations, and Migrate to a Partially Contained Database in
Microsoft Docs:
http://aka.ms/aaew1r
http://aka.ms/Cw48th
http://aka.ms/Udr653
Demonstration Steps
View the Containment Value
1. In SQL Server Management Studio, open ContainedDatabase.sql in the D:\Demofiles\Mod01
folder.
2. Select the code under Step A, and then click Execute. Note that the value returned is ‘1’ as
containment should be enabled.
3. Select the code under Step B, and then click Execute. Note that the value_in_use is ‘0’ (containment
is disabled). To confirm this:
b. In the Server Properties - MIA-SQL dialog box, on the Advanced page, note the Enable
Contained Databases attribute is False, and then click Cancel.
4. Select the code under Step C, and then click Execute. Note that the value_in_use is set back to ‘1’.
To confirm this:
b. In the Server Properties - MIA-SQL dialog box, on the Advanced page, note the Enable
Contained Databases attribute is True, and then click Cancel.
6. In Object Explorer, under MIA-SQL, right-click Databases, and then click Refresh to view a list of
databases, including the new partially contained database.
7. Select the code under Step E, and then click Execute. The new users will be returned from the SELECT
statement.
8. In Object Explorer, under MIA-SQL, under Databases, expand PClientData, expand Views, and
expand System Views, right-click sys.database_principals, and then click Select Top 1000 Rows.
Confirm the new users have been created in the list of contained users.
9. In the ContainedDatabase.sql tab, select the code under Step F, and then click Execute.
10. Close SQL Server Management Studio, without saving any changes.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
A SQL Server login for the Sales Support application. The login should have the following properties:
o Password: Pa55w.rd
2. Ensure that the authentication mode for the MIA-SQL SQL Server instance is set appropriately to
support the requirements.
Results: After this exercise, you should have verified the authentication modes supported by the MIA-
SQL instance, and created three logins.
2. Use SSMS Object Explorer to verify that a user called ADVENTUREWORKS\WebApplicationSvc has
been created in the AdventureWorks database.
2. Using SSMS, check the properties of the SalesSupport login to verify that it is mapped to the
ServiceUser user.
Results: At the end of this exercise, you will have created three database users and mapped them to the
logins you created in the previous exercise.
Your task is to investigate and correct the issues with the LegacySalesLogin.
For more information about troubleshooting this issue, see MSSQLSERVER_18456 in Microsoft Docs:
MSSQLSERVER_18456
http://aka.ms/N2n2j6
2. Check the SQL Server log to determine the cause of the failure.
InternetSalesApplication
o ADVENTUREWORKS\WebApplicationSvc
o InternetSalesApplication
2. Rerun the orphaned user report to verify that no orphaned users remain.
Review Question(s)
Module 2
Assigning Server and Database Roles
Contents:
Module Overview 2-1
Lesson 1: Working with Server Roles 2-2
Module Overview
Using roles simplifies the management of user permissions. With roles, you can control authenticated
users’ access to system resources based on each user’s job function—rather than assigning permissions
user-by-user, you can grant permissions to a role, then make users members of roles.
Microsoft® SQL Server® includes support for security roles defined at server level and at database level.
Note: The types of roles available to you will vary, depending on whether you are using a
full installation of SQL Server (running on-premises or in the cloud) or Azure® SQL Database.
Azure SQL Database does not offer server-level roles.
For a comparison of the security features offered by SQL Server and Azure SQL Database, see Security
Center for SQL Server Database Engine and Azure SQL Database in Microsoft Docs:
Security Center for SQL Server Database Engine and Azure SQL Database
http://aka.ms/rxh35d
Note: Server and database roles should be assigned permissions according to the principle of least
privilege; each role should hold only the minimum permissions it needs to function normally.
Objectives
After completing this module, you will be able to:
Use custom database roles and application roles to manage database-level security.
Lesson 1
Working with Server Roles
Roles defined at server level are used to control access to server-level permissions. This lesson covers
server-level permissions, the purposes of the built-in server roles, and how to create server-level roles that
are customized to your requirements.
Lesson Objectives
After completing this lesson, you will be able to:
Server-Scoped Permissions
Permissions at the server level generally relate to
administrative actions, such as creating databases,
altering logins, or shutting down the server. Some
fundamental permissions, such as CONNECT SQL
(which permits a login to connect to the SQL Server
Database Engine), are also managed at server level.
You can view a complete list of server-scoped
permissions using the system function
sys.fn_builtin_permissions:
sys.fn_builtin_permissions Server
SELECT *
FROM sys.fn_builtin_permissions('SERVER')
ORDER BY permission_name;
Server permissions are organized as a hierarchy; when you grant a server-level permission to a server
principal (either a login or a role), this implicitly also grants the child permissions of the granted
permission to the principal.
The topmost node in the hierarchy of server-level permissions (and therefore the server permission that
encompasses all other server permissions) is CONTROL SERVER.
You can visualize the hierarchy of server permissions with a recursive query similar to the following. The anchor member selects CONTROL SERVER (the only server permission with no covering permission); the recursive member walks down the hierarchy through covering_permission_name:
WITH server_permissions AS
(
    SELECT permission_name, covering_permission_name, 1 AS level
    FROM sys.fn_builtin_permissions('SERVER')
    WHERE covering_permission_name = ''
    UNION ALL
    SELECT p.permission_name, p.covering_permission_name, sp.level + 1
    FROM sys.fn_builtin_permissions('SERVER') AS p
    JOIN server_permissions AS sp
        ON p.covering_permission_name = sp.permission_name
)
SELECT permission_name, covering_permission_name AS parent_permission, level
FROM server_permissions
ORDER BY level, permission_name;
Server permissions can only be granted to server principals (logins and server roles).
For more information on sys.fn_builtin_permissions, see sys.fn_builtin_permissions (Transact-SQL) in
Microsoft Docs:
sys.fn_builtin_permissions (Transact-SQL)
http://aka.ms/uk6ymp
CONTROL SERVER
This is the topmost server-level permission; granting it implicitly grants every other server-scoped permission.
ALTER ANY DATABASE
This permission grants administrative control over all databases on a SQL Server instance. It implicitly grants the CREATE ANY DATABASE permission.
ALTER ANY LOGIN
This permission grants administrative control over any login, including logins for linked servers.
ALTER SERVER STATE
This permission grants the right to execute DBCC commands that affect the contents of the buffer pool (DBCC FREE*CACHE and DBCC SQLPERF). It implicitly grants the VIEW SERVER STATE permission.
ALTER SETTINGS
This permission grants EXECUTE permissions on sp_configure and the RECONFIGURE command.
ALTER TRACE
The following table shows a list of the fixed server-level roles with a description of the permissions
granted to role members:
sysadmin This role grants permissions to perform any action on the server. You
should limit membership of this role as far as possible.
serveradmin This role grants permissions to configure server-wide settings and to shut
down the server.
securityadmin This role grants permissions to manage logins. This includes the ability to
create and drop logins and the ability to assign permissions to logins.
Members of this role can grant and deny server-level permissions to
other users and grant and deny database-level permissions to other users
on any database to which they have access. Because of the ability to
assign permissions to other users, membership of this role should be
limited as much as possible. This role should be treated as equivalent to
sysadmin.
processadmin This role grants permissions to terminate sessions running on the SQL
Server instance.
bulkadmin This role grants permissions to execute the BULK INSERT statement.
public Every user is a member of public and this cannot be changed. This role
does not initially grant any administrative permissions. Though you can
add permissions to this role, this is not advisable because the permissions
would be granted to every user.
To keep to the principle of least privilege, you should avoid using fixed server roles as far as possible.
Unless a fixed server role holds exactly the permissions required for a server principal, you should create a
user-defined server role with only the permissions that the principal requires.
The exception to this is the sysadmin role. The CONTROL SERVER permission is not directly equivalent to
membership of the sysadmin role, and there are more than 150 system stored procedures and functions
that explicitly check for membership of sysadmin before executing.
Note: Unlike in some earlier versions of SQL Server, in SQL Server 2017 the
BUILTIN\administrators and Local System (NT AUTHORITY\SYSTEM) accounts are not
automatically added as members of the sysadmin role, although you can add them manually if
required. Note that this does not affect the ability of local administrators to access the database
engine when it is in single-user mode.
To view membership of server-level roles, you can query the sys.server_role_members system view:
Role Membership
SELECT spr.name AS role_name, spm.name AS member_name
FROM sys.server_role_members AS rm
JOIN sys.server_principals AS spr
ON spr.principal_id = rm.role_principal_id
JOIN sys.server_principals AS spm
ON spm.principal_id = rm.member_principal_id
ORDER BY role_name, member_name;
For more information on fixed server roles and the server-level permissions assigned to them, see Server-
Level Roles in Microsoft Docs:
Server-Level Roles
http://aka.ms/fxwahr
public Permissions
By default, the public role has CONNECT and VIEW ANY DATABASE permissions. Unlike other fixed server
roles, you can grant additional permissions to public, but when doing so, you should be aware that you
are effectively granting the permission to every login.
When you create a server role, you can optionally define an owner for the role with the
AUTHORIZATION clause. The role owner has permission to add and remove members of the role.
If you do not explicitly define an owner, the role owner is the server principal executing the
CREATE SERVER ROLE statement.
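For example, the following statement creates a server role and assigns ownership to an existing login (the role and owner names here are illustrative):

```sql
CREATE SERVER ROLE app_admin AUTHORIZATION [ADVENTUREWORKS\Database_Managers];
```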
Unlike fixed server roles, the members of a user-defined server role do not, by default, have permission to
add other security principals as members of the role.
For more information on the CREATE SERVER ROLE statement, see CREATE SERVER ROLE (Transact-SQL)
in Microsoft Docs:
You can remove a server role with the DROP SERVER ROLE statement.
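For example, using the illustrative role name from this lesson:

```sql
DROP SERVER ROLE app_admin;
```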
Managing Permissions
You can grant a user-defined role permissions on server-level securables with Transact-SQL by using the
GRANT, DENY, and REVOKE commands, or through the SSMS GUI.
In the following example, the ALTER TRACE permission is granted to the app_admin role:
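The grant itself is a single statement; it might look like this:

```sql
GRANT ALTER TRACE TO app_admin;
```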
For more information on granting server-level permissions, see GRANT (Transact-SQL) in Microsoft Docs:
GRANT (Transact-SQL)
https://aka.ms/Ke2635
Managing Membership
You can add logins to, and remove them from, server-level roles by using SSMS or the ALTER SERVER ROLE
Transact-SQL statement.
In the following code example, the ADVENTUREWORKS\WebAdmins login is added to the app_admin
server-level role:
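The statement might look like this:

```sql
ALTER SERVER ROLE app_admin ADD MEMBER [ADVENTUREWORKS\WebAdmins];
```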
You can also create hierarchies of role membership by making a user-defined server role a member of
another server role.
Best Practice: It is not recommended that you make user-defined server roles members of
fixed server roles. Doing so will give control over membership of the fixed server role to members
of the user-defined server role, which may lead to unintended escalation of privileges.
To remove a role member, use the ALTER SERVER ROLE statement with the DROP MEMBER clause.
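For example:

```sql
ALTER SERVER ROLE app_admin DROP MEMBER [ADVENTUREWORKS\WebAdmins];
```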
For more information on the ALTER SERVER ROLE statement, see ALTER SERVER ROLE (Transact-SQL) in
Microsoft Docs:
Server-scoped permissions.
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. Run Setup.cmd in the D:\Demofiles\Mod02 folder as Administrator. In the User Account Control
dialog box, click Yes.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine instance using
Windows® authentication.
4. On the File menu, point to Open, and then click Project/Solution.
6. In Solution Explorer, expand Queries, and then double-click Demo 1 - server roles.sql.
7. Execute the code under the heading for Step 1 to show the permission hierarchy.
8. Execute the code under the heading for Step 2 to create two logins.
9. Execute the code under the heading for Step 3 to show that a new login is a member of public.
10. Execute the code under the heading for Step 4 to demonstrate the permissions of a new login.
11. Execute the code under the heading for Step 5 to add a login to the diskadmin role.
12. Execute the code under the heading for Step 6 to verify the membership created in the previous step.
13. Execute the code under the heading for Step 7 to create a user-defined server role.
14. Execute the code under the heading for Step 8 to grant permissions to the new role.
15. Execute the code under the heading for Step 9 to make a login a member of the new role.
16. Execute the code under the heading for Step 10 to verify the membership created in the previous
step.
17. Execute the code under the heading for Step 11 to show the permissions of the login.
18. Execute the code under the heading for Step 12 to remove the logins and role.
serveradmin
securityadmin
processadmin
setupadmin
bulkadmin
Lesson 2
Working with Fixed Database Roles
In SQL Server, roles are supported at database level. The purpose of database-level roles is similar to the
purpose of the server-level roles that you have already learned about in this module—to simplify the
administration of database permissions. This lesson covers the fixed database roles available in every SQL
Server database.
Lesson Objectives
After completing this lesson, you will be able to:
Database-Scoped Permissions
Permissions at database level relate to:
Administrative actions related to the
management of database objects (for example,
BACKUP DATABASE and the commands to
create or alter database objects).
sys.fn_builtin_permissions Database
SELECT *
FROM sys.fn_builtin_permissions('DATABASE')
ORDER BY permission_name;
Like server permissions, database permissions are organized in a hierarchy. The hierarchy has two top-
level nodes—CONTROL, which is the parent node for all other database permissions, and CREATE
DATABASE, which has no children. Although database permissions can only be granted to database
principals (users and roles), some server-level permissions implicitly grant database permissions.
You can visualize the hierarchy of database permissions with a recursive query similar to the following. The parent_server_permission column shows the server-level permission that implicitly grants the database permission:
WITH database_permissions AS
(
    SELECT permission_name, covering_permission_name,
           parent_covering_permission_name AS parent_server_permission,
           1 AS level
    FROM sys.fn_builtin_permissions('DATABASE')
    WHERE covering_permission_name = ''
    UNION ALL
    SELECT p.permission_name, p.covering_permission_name,
           p.parent_covering_permission_name, dp.level + 1
    FROM sys.fn_builtin_permissions('DATABASE') AS p
    JOIN database_permissions AS dp
        ON p.covering_permission_name = dp.permission_name
)
SELECT permission_name, covering_permission_name AS parent_permission,
       parent_server_permission, level
FROM database_permissions
ORDER BY level, permission_name;
db_owner Members of this role have comprehensive rights over the database,
equivalent to the owner of the database. This includes the right to
fully manage the database and also to drop the database, so you
must use this role with caution.
db_accessadmin Members of this role can manage access to the database by Windows
logins, Windows groups, and SQL Server logins.
db_backupoperator Members of this role can back up the database. Note that this role
does not have the right to restore the database.
db_ddladmin Members of this role can run any Database Definition Language
(DDL) Transact-SQL commands in the database. DDL is the portion of
the Transact-SQL language that deals with creating, altering, and
deleting database objects.
db_datawriter Members of this role can change (INSERT, UPDATE, and DELETE) data
in the database.
db_datareader Members of this role can read data from all database tables.
db_denydatawriter Members of this role cannot change (INSERT, UPDATE, and DELETE)
data in the database.
db_denydatareader Members of this role cannot read data from any database tables.
public All database users belong to the public role. When a user has no
explicit or inherited permissions on a database object, the permissions
granted to public will be used.
To keep to the principle of least privilege, you should avoid using fixed database roles as far as possible.
Unless a fixed database role holds exactly the permissions needed for a group of database principals, you
should create a user-defined database role with only the permissions the principals require.
To view membership of database-level roles, you can query the sys.database_role_members system view:
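The query mirrors the server-level role membership query shown earlier in this module, substituting the database-level catalog views:

```sql
SELECT dpr.name AS role_name, dpm.name AS member_name
FROM sys.database_role_members AS rm
JOIN sys.database_principals AS dpr
    ON dpr.principal_id = rm.role_principal_id
JOIN sys.database_principals AS dpm
    ON dpm.principal_id = rm.member_principal_id
ORDER BY role_name, member_name;
```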
Database-Level Roles
http://aka.ms/q4b8tn
dbmanager. Members of this role in the master database may create, alter, and drop databases on
the Azure SQL Database instance.
loginmanager. Members of this role in the master database may create, alter, and drop logins on
the Azure SQL Database instance.
For more information on security configuration in Azure SQL Database, see Controlling and granting
database access in the Microsoft Azure documentation:
Note that:
You may make a database user or a user-defined database role a member of a database role.
You may not make a fixed database role a member of another database role.
Membership of database roles is limited to database principals. Server principals cannot be made
members of database roles.
For more information on the ALTER ROLE command, see ALTER ROLE (Transact-SQL) in Microsoft Docs:
Database Owner
When discussing database ownership in SQL Server,
there are two related concepts you must
understand:
The dbo user has a default schema in the database, also called dbo. Database objects created by
members of the sysadmin fixed server role are added by default to the dbo schema.
The fixed database role db_owner is a role whose members have full administrative permissions for a
database. The dbo user is, by default, a member of the db_owner role, but you may add as many
members to the db_owner role as you wish. Amongst other permissions, members of the db_owner role
have permission to create, delete, and alter database objects in any schema, including the dbo schema.
Demonstration Steps
1. In SQL Server Management Studio, in Object Explorer, click Connect, and then click Database
Engine.
2. In the Connect to Server dialog box, in the Server name box, type the name of your Azure instance
running the AdventureWorksLT database, for example, <servername>.database.windows.net.
3. In the Authentication list, click SQL Server Authentication, in the Login box, type Student, in the
Password box, type Pa55w.rd, and then click Connect.
5. On the Query menu, point to Connection, and then click Change Connection.
6. In the Connect to Database Engine dialog box, in the Server name list, click
<servername>.database.windows.net, in the Password box, type Pa55w.rd, and then click
Connect.
7. Execute the code under the heading Step 1 to create a new login.
8. On the toolbar, in the Available Databases list, click AdventureWorksLT.
9. Execute the code under the heading for Step 2 to create a new user.
10. Execute the code under the heading for Step 3 to demonstrate the database permissions hierarchy.
11. Execute the code under the heading for Step 4 to demonstrate that the user is a member of the
public database role by default.
12. Execute the code under the heading for Step 5 to add the user to two fixed database roles.
13. Execute the code under the heading for Step 6 to verify the user’s role memberships.
14. Execute the code under Step 7 to remove the demonstration objects.
15. On the Query menu, point to Connection, and then click Change Connection.
16. In the Connect to Database Engine dialog box, in the Server name list, click
<servername>.database.windows.net, in the Password box, type Pa55w.rd, and then click
Connect.
19. Keep SQL Server Management Studio open for the next demonstration.
Lesson 3
User-Defined Database Roles
When you want finer-grained control over permissions on database objects than is offered by the fixed
database roles, you can define your own database roles and grant them only the permissions they need.
This lesson covers two types of user-defined database roles supported by SQL Server—database roles and
application roles.
Lesson Objectives
After completing this lesson, you will be able to:
For more information on the CREATE ROLE statement, see CREATE ROLE (Transact-SQL) in Microsoft Docs:
You can remove a database role with the DROP ROLE statement. You cannot remove a role that is the
owner of one or more securable objects.
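As a minimal sketch (the product_reader role name is taken from the examples later in this lesson):

```sql
-- Create a role, optionally specifying an owner.
CREATE ROLE product_reader AUTHORIZATION dbo;

-- ... grant permissions and add members here ...

-- Remove the role when it is no longer needed.
DROP ROLE product_reader;
```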
Managing Permissions
You can grant user-defined roles permissions on database objects with Transact-SQL by using the GRANT,
DENY, and REVOKE commands, or through the SSMS GUI.
In the following example, the SELECT permission on the Production.Product table is granted to the
product_reader role:
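The statement might look like this:

```sql
GRANT SELECT ON Production.Product TO product_reader;
```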
For more information on the commands used to manage database permissions, see GRANT (Transact-SQL) in
Microsoft Docs:
GRANT (Transact-SQL)
https://aka.ms/Ke2635
Managing Membership
You can add users to, and remove them from, database-level roles by using SSMS or the ALTER ROLE
Transact-SQL statement.
In the following code example, the WebApp user is added to the product_reader role:
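The statement might look like this:

```sql
ALTER ROLE product_reader ADD MEMBER WebApp;
```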
You may also create hierarchies of role membership by making a user-defined database role a member of
another database role.
To remove a role member, use the ALTER ROLE statement with the DROP MEMBER clause.
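For example:

```sql
ALTER ROLE product_reader DROP MEMBER WebApp;
```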
For more information on the ALTER ROLE statement, see ALTER ROLE (Transact-SQL) in Microsoft Docs:
Note: All the examples in this topic use a role called example_role. The code needed to
create this role is not shown.
You might grant a role permission on a database schema; the permissions you grant at schema level
automatically apply to all the objects in the schema:
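For example, the following statement gives example_role SELECT permission on every object in the HumanResources schema (the schema name here is illustrative):

```sql
GRANT SELECT ON SCHEMA::HumanResources TO example_role;
```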
You might grant a role permission at database level; the permissions you grant at database level
automatically apply to all the objects in the database:
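For example, granting SELECT at database level gives example_role SELECT permission on every table and view in the database:

```sql
GRANT SELECT TO example_role;
```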
Note: The SHOWPLAN permission allows a database principal to view the query execution
plan for queries that the principal has permission to execute.
In a full SQL Server installation, to view the contents of the query plan cache, you must have the
server-level VIEW SERVER STATE permission; VIEW SERVER STATE implicitly grants SHOWPLAN at
database level.
In Azure SQL Database, there is no VIEW SERVER STATE permission. On premium performance
tiers, the equivalent permission is VIEW DATABASE STATE. For other performance tiers, only the
admin account may view the plan cache.
This approach is useful when you want to add database permission scripts to a source control system
during application development.
Demonstration Steps
1. In SQL Server Management Studio, in Solution Explorer, double-click Demo 3 - user database
roles.sql.
2. On the Query menu, point to Connection, and then click Change Connection.
3. In the Connect to Database Engine dialog box, in the Server name list, click MIA-SQL, in the
Authentication box, click Windows Authentication, and then click Connect.
4. Execute the code under the heading Step 1 to create a user-defined database role.
5. Execute the code under the heading Step 2 to grant SELECT permissions on the HumanResources
schema to the role.
6. Execute the code under the heading Step 3 to verify the role permissions.
7. Execute the code under the heading Step 4 to add two users to the role.
8. Execute the code under the heading Step 5 to verify the role’s membership.
9. Execute the code under the heading Step 6 to remove the demonstration role.
Note: An application role completely replaces a user’s security context. None of the user’s
permissions apply when they activate an application role.
You might use an application role if you want to use Windows authentication to give users access to a
database, but do not want to grant them the same permissions as the applications they use. For example,
members of a Windows group use an application that requires full access to tables in the Purchasing
schema. You want the users to be able to access these tables through the application, but do not want
them to be able to execute impromptu SELECT, INSERT and UPDATE statements. An application role
provides one method of achieving this.
For more information about application roles, see Application Roles in Microsoft Docs:
Application Roles
http://aka.ms/xoywhz
You create application roles using the CREATE APPLICATION ROLE statement:
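For example (the role name, password, and default schema here are illustrative):

```sql
CREATE APPLICATION ROLE sales_app
WITH PASSWORD = 'Pa55w.rd',
     DEFAULT_SCHEMA = Sales;
```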
For more information on the CREATE APPLICATION ROLE statement, see CREATE APPLICATION ROLE
(Transact-SQL) in Microsoft Docs:
You can drop application roles with the DROP APPLICATION ROLE statement.
Note: If calls to the sp_setapprole stored procedure will be made across a network, you
must ensure that the connection is encrypted—for example, using SSL or IPSec—to avoid
exposing the application role password, because the password is effectively sent in plain text.
sp_setapprole accepts an optional @encrypt = 'odbc' parameter, but this uses an ODBC function
that obfuscates the password rather than truly encrypting it. The ODBC encrypt function is not
supported by the SqlClient library.
An application role remains active until the user disconnects from SQL Server or it is deactivated by using
the sp_unsetapprole stored procedure. However, to use sp_unsetapprole, you must specify a cookie that
was generated when the application role was activated.
The following code example shows how to create a cookie when activating an application role, and how
to use the cookie to deactivate the application role when it is no longer required:
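A sketch of the pattern (the role name and password are illustrative):

```sql
-- Activate the application role, capturing a cookie.
DECLARE @cookie varbinary(8000);
EXEC sys.sp_setapprole
    @rolename = 'sales_app',
    @password = 'Pa55w.rd',
    @fCreateCookie = true,
    @cookie = @cookie OUTPUT;

-- ... perform work in the application role's security context ...

-- Deactivate the role, reverting to the original security context.
EXEC sys.sp_unsetapprole @cookie;
```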
Note: Applications that use an application role must disable connection pooling unless
they store the application role cookie and use sp_unsetapprole to revert the connection to its
original security context before returning it to the pool.
sp_setapprole (Transact-SQL)
http://aka.ms/homihf
Application roles are database principals that are not linked to a server principal. As a result, application
roles are restricted to the permissions granted to the guest user in other databases on the instance. In
databases where the guest user is disabled (the default behavior), an application role has no access to the
database at all.
Demonstration Steps
1. In SQL Server Management Studio, in Solution Explorer, double-click Demo 4 - application roles.sql.
2. On the Query menu, point to Connection, and then click Change Connection.
3. In the Connect to Database Engine dialog box, in the Server name list, click MIA-SQL, in the
Authentication list, click Windows Authentication, and then click Connect.
4. Execute the code under the heading Step 1 to create an application role.
5. Execute the code under the heading Step 2 to grant permissions to the role.
6. Execute the code under the heading Step 3 to demonstrate behavior before the application role is
activated.
7. Execute the code under the heading Step 4 to activate the application role.
8. Execute the code under the heading Step 5 to demonstrate that the application role permissions
apply.
9. Execute the code under the heading Step 6 to show how the user’s identity is represented.
10. Execute the code under the heading Step 7 to show that cross-database access is limited.
11. Execute the code under the heading Step 8 to exit the application role.
12. Execute the code under the heading Step 9 to show how the user’s identity is represented.
13. Execute the code under the heading Step 10 to remove the application role.
Sequencing Activity
To indicate the correct order, number each of the following steps, which describe the sequence of actions
when an application uses an application role.
Steps
Objectives
After completing this lab, you will be able to:
Implement and manage fixed server roles and user-defined server roles.
Password: Pa55w.rd
The adventureworks.msft domain includes the following global group relevant to this task:
ADVENTUREWORKS\Database_Managers:
o ADVENTUREWORKS\IT_Support
A server login has already been created for the ADVENTUREWORKS\Database_Managers group on the
MIA-SQL instance, but no permissions have been assigned.
3. Under the heading for Task 1, write a script to create a server role called database_manager.
Results: At the end of this exercise, you will have created the database_manager server role, granted it
permissions to alter any login and alter any database, and added the ADVENTUREWORKS\Database_Managers
login as a member of the role.
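A script meeting these results might look like the following (the names are taken from the exercise description):

```sql
CREATE SERVER ROLE database_manager;

GRANT ALTER ANY LOGIN TO database_manager;
GRANT ALTER ANY DATABASE TO database_manager;

ALTER SERVER ROLE database_manager
    ADD MEMBER [ADVENTUREWORKS\Database_Managers];
```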
o db_accessadmin
o db_backupoperator
3. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have mapped the Database_Managers login to the
salesapp1 database and added them to the db_backupoperator and db_accessadmin roles.
ADVENTUREWORKS\InternetSales_Users
ADVENTUREWORKS\InternetSales_Managers
Both of these groups already have a login to the MIA-SQL instance, but are not configured to access the
salesapp1 database.
o sales_order_writer
2. Write and execute a query to grant UPDATE permissions on the Sales.Orders table and the
Sales.OrderDetails table to the sales_order_writer role.
3. Leave SQL Server Management Studio open for the next exercise.
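The grants described in step 2 might be written as:

```sql
GRANT UPDATE ON Sales.Orders TO sales_order_writer;
GRANT UPDATE ON Sales.OrderDetails TO sales_order_writer;
```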
Results: At the end of this exercise, you will have created user-defined database roles and assigned them
to database principals.
3. In the SQLCMD window, enter the following commands to verify your identity:
SELECT SUSER_NAME();
GO
Note that SQL Server identifies Windows group logins using their individual user account, even
though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member
of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the
ADVENTUREWORKS\Database_Managers domain group.
4. In the SQLCMD window, execute an ALTER LOGIN to change the password of the login for the
Marketing_Application login. Your code should look similar to this:
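For example, assuming the course's standard password as the new value:

```sql
ALTER LOGIN Marketing_Application WITH PASSWORD = 'Pa55w.rd';
```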
7. In SSMS, view the properties of the ADVENTUREWORKS\WebApplicationSvc login, verify that the
login is disabled, and then re-enable it.
4. Verify that you cannot update the Sales.Orders table in the salesapp1 database. For example,
execute the following query:
This command should return an error if the role permissions are correctly applied.
3. Verify that you can query the Sales.Orders table in the salesapp1 database. For example, execute the
following query:
4. Verify that you can query the Production.Suppliers table in the salesapp1 database. For example,
execute the following query:
5. Verify that you can update the Sales.Orders table in the salesapp1 database. For example, execute
the following query:
Results: At the end of this exercise, you will have verified your new security settings.
Question:
Your organization wants to track data access by individual Windows users. Does this mean
you cannot base logins on Windows groups?
Best Practice: When implementing role-based security in SQL Server, consider the
following best practices:
Use Windows group logins linked to roles to simplify ongoing management where possible.
Aim to grant the minimum number of explicit permissions possible to meet the security requirements,
and use membership of roles and inheritance to ensure the correct effective permissions.
Ensure that every database user has only the permissions they actually require.
Review Question(s)
Verify the correctness of each statement by placing a mark in the Answer column.
Module 3
Authorizing Users to Access Resources
Contents:
Module Overview 3-1
Lesson 1: Authorizing User Access to Objects 3-2
Module Overview
In the previous modules, you have seen how Microsoft® SQL Server® security is organized and how sets
of permissions can be assigned at the server and database level by using fixed server roles, user-defined
server roles, fixed database roles, and application roles. The final step in authorizing users to access SQL
Server resources is the authorization of users and roles to access server and database objects.
In this module, you will see how these object permissions are managed. In addition to access permissions
on database objects, SQL Server provides the ability to determine which users are allowed to execute
code, such as stored procedures and functions. In many cases, these permissions and the permissions on
the database objects are best configured at the schema level rather than at the level of the individual
object. Schema-based permission grants can simplify your security architecture. You will explore the
granting of permissions at the schema level in the final lesson of this module.
Objectives
After completing this module, you will be able to:
Lesson 1
Authorizing User Access to Objects
Before moving on to managing permissions on code, you need to consider how permissions are managed
on database objects. SQL Server has a fine-grained security model that means you can grant the minimum
permissions to users that will allow them to do their work. In particular, permissions can be granted at the
column level, not just at the table and view level. You will also see how you can delegate the work of
granting permissions to other users.
Lesson Objectives
After completing this lesson, you will be able to:
At the database level, principals include database users, fixed and user-defined database roles, and
application roles.
Every principal has two numeric IDs associated with it—a principal ID and a security identifier (SID).
DENY
In ANSI (American National Standards Institute) SQL, if a user does not have permission to perform an
action, they cannot do so. Therefore, ANSI SQL does not provide a DENY statement.
Because SQL Server is closely linked with the Windows operating system and its group membership
system, a Windows user can receive permissions through their group membership. Therefore, there may
be cases where a user has a permission granted through their group membership that you might want to
remove for the individual.
To support this scenario, SQL Server provides the DENY statement, which you can use to explicitly deny a
user a permission that they might otherwise receive through membership of a group or role. This is very
similar to the way denying permissions works in Windows. For example, you might decide that all
members of the Salespeople group can access a color printer, except Holly, who is a member of
Salespeople but causes problems with the printer. You grant access to the Salespeople group and then
deny it to Holly. SQL Server works in the same way: you could grant SELECT permission on a table to
every member of the Salespeople role, and then deny Holly access to that table.
As with Windows, you should use DENY sparingly. If you need to DENY many permissions, it might
indicate a potential problem with your security design.
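The scenario above can be sketched in Transact-SQL as follows; the table name, role name, and user name are assumptions for illustration:

```sql
-- Grant SELECT to the whole role, then deny it to one member.
-- Marketing.Salesperson, Salespeople, and Holly are hypothetical names.
GRANT SELECT ON Marketing.Salesperson TO Salespeople;  -- database role
DENY  SELECT ON Marketing.Salesperson TO Holly;        -- individual user
```

Because DENY takes precedence over GRANT, Holly cannot read the table even though the role she belongs to can.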
Note: The REVOKE statement revokes both GRANT and DENY statements.
Applying Permissions
USE MarketDev;
GO
Note that two forms of the command are shown. While the full terminology involves OBJECT:: as a prefix,
this is optional. In the second example, the same GRANT statement is shown without the OBJECT:: prefix.
It is not strictly necessary to specify the schema for the table or view, but doing so is highly recommended
to ensure that permissions are granted on the intended object. If the schema name is not specified, the
default schema for the user granting the permission is used. If the object is not found in the user's default
schema, the dbo schema is used instead.
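The two forms described above might look like this; the object and principal names are assumptions for illustration:

```sql
-- Full form, with the optional OBJECT:: prefix
GRANT SELECT ON OBJECT::Marketing.Salesperson TO User1;

-- Equivalent shorter form, without the prefix
GRANT SELECT ON Marketing.Salesperson TO User1;
```

Both statements grant the same permission; note that the schema name is included in each case, as recommended.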
REFERENCES
While the meaning of the SELECT, INSERT, UPDATE, and DELETE permissions will likely be obvious to you,
the meaning and purpose of the REFERENCES permission might not. A user must hold the REFERENCES
permission on an object before a foreign key relationship can specify that object as a target; the
permission is only required if the user holds no other permissions on the referenced object.
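A minimal sketch of how REFERENCES is used; the table, column, and constraint names are assumptions for illustration:

```sql
-- Allow User1 to target Marketing.Salesperson in a foreign key,
-- without granting any data access on that table.
GRANT REFERENCES ON OBJECT::Marketing.Salesperson TO User1;

-- User1 can now create a constraint that references the table:
ALTER TABLE Sales.Orders
ADD CONSTRAINT FK_Orders_Salesperson
    FOREIGN KEY (SalespersonID) REFERENCES Marketing.Salesperson (SalespersonID);
```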
Column-Level Security
In addition to assigning permissions at table or
view level, you can also allocate column-level
permissions. This provides a more granular level of
security for data in your database.
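A column-level grant lists the permitted columns in parentheses after the permission. The column names below are assumptions for illustration:

```sql
-- Grant SELECT on only two columns of the table; other columns remain inaccessible.
GRANT SELECT (SalespersonID, EmailAlias) ON Marketing.Salesperson TO User1;
```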
User1 can now access two columns in the Marketing.Salesperson table, but not the whole table.
Row-Level Security
Row-level security (RLS) helps you to control
access to rows in a table. This can be useful in a
variety of scenarios, including when you want a
salesperson to only access customer data for
customers in their region; or when you want to
limit an employee to only access data relevant to
their department. One key advantage of this is
that the logic for the separation is in the actual
database. This reduces the risk of errors in the
application that is allowing inappropriate access,
and simplifies the security implementation for that
table.
Functionally, this is similar to horizontal partitioning of data or including a WHERE clause in a query. You
implement RLS by adding a security predicate defined as an inline table-valued function that returns 1
when the predicate is met.
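A sketch of an RLS implementation, assuming a Security schema exists and that the application stores the user's region in SESSION_CONTEXT; all object names here are assumptions:

```sql
-- Inline table-valued function used as the security predicate:
-- returns a row (1) only when the row's Region matches the session's region.
CREATE FUNCTION Security.fn_RegionPredicate (@Region AS nvarchar(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
       WHERE @Region = CONVERT(nvarchar(50), SESSION_CONTEXT(N'Region'));
GO

-- Bind the predicate to the table as a filter, so SELECT statements
-- only see rows for which the predicate returns 1.
CREATE SECURITY POLICY Security.CustomerFilter
ADD FILTER PREDICATE Security.fn_RegionPredicate(Region)
ON Sales.Customer
WITH (STATE = ON);
```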
For more information about RLS, see Row-Level Security in Microsoft Docs:
Row-Level Security
http://aka.ms/Br9x2d
CASCADE
The challenge of the WITH GRANT OPTION clause comes when you need to REVOKE or DENY the
permission that you granted to James using the WITH GRANT OPTION. You do not know which other
users James has already granted the permission to.
When revoking or denying a permission, you can use the CASCADE clause to also revoke or deny the
permission from any users to whom James has granted it.
In this example, the REVOKE statement will fail if you omit the CASCADE clause, because the GRANT
statement included the WITH GRANT OPTION clause.
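The pattern might look like this; the table name is an assumption based on earlier examples:

```sql
-- James can use and re-grant SELECT on the table:
GRANT SELECT ON Marketing.Salesperson TO James WITH GRANT OPTION;

-- James may have granted SELECT to other users in the meantime.
-- CASCADE removes the permission from James and from anyone he granted it to;
-- without CASCADE, this REVOKE fails because of the WITH GRANT OPTION.
REVOKE SELECT ON Marketing.Salesperson FROM James CASCADE;
```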
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual
machines are running, and log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
8. Execute the code under the heading for Step 1 to create a user for the demonstration.
9. Execute the code under the heading for Step 2 to query the list of server principals. Note
Mod03Login at the end of the list.
10. Execute the code under the heading for Step 3 to query the list of database principals. Again, note
Mod03Login in the list.
11. Execute the code under the heading for Step 4 to grant SELECT permissions on the Product table to
Mod03Login.
12. Execute the code under the heading for Step 5 to change the execution context.
13. Execute the code under the heading for Step 6 to test the permissions. Note that you can select from
the Product table that you were granted permissions on, but not from the ProductInventory table.
14. Execute the code under the heading for Step 7 to revert the execution context.
15. Execute the code under the heading for Step 8 to grant SELECT permissions on specific columns in
the ProductInventory table to Mod03Login.
16. Execute the code under the heading for Step 9 to change the execution context.
17. Execute the code under the heading for Step 10 to test the permissions. Note that the first query to
select the two specific columns executes, but you cannot select all the columns from the
ProductInventory table.
18. Execute the code under the heading for Step 11 to revert the execution context.
19. On the File menu, click Close.
20. Leave SQL Server Management Studio open for the next demonstration.
Lesson 2
Authorizing Users to Execute Code
In addition to providing you with control over who accesses data in your database or the objects in your
server, SQL Server helps you to control which users can execute your code. Appropriate security control of
code execution is an important aspect of your security architecture.
In this lesson, you will see how to manage the security of stored procedures and functions. You will also
learn how to manage security for code that lives in .NET-managed code assemblies that are used with SQL
CLR integration. Finally, you will see how ownership chains affect the security relationship between code
and database objects.
Lesson Objectives
After completing this lesson, you will be able to:
The ALTER permission enables a user to change the definition of a stored procedure.
The VIEW DEFINITION permission enables a user to view the code definition of the stored procedure.
Note: You can use SSMS or Transact-SQL to grant permissions on user stored procedures,
but you can only grant permissions on system stored procedures by using Transact-SQL code.
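The three stored procedure permissions discussed above can be granted as follows; the procedure and user names are assumptions for illustration:

```sql
-- Run the procedure:
GRANT EXECUTE ON OBJECT::dbo.uspGetManagerEmployees TO User1;

-- Change the procedure's definition:
GRANT ALTER ON OBJECT::dbo.uspGetManagerEmployees TO User1;

-- View the procedure's source code:
GRANT VIEW DEFINITION ON OBJECT::dbo.uspGetManagerEmployees TO User1;
```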
It is uncommon to directly update a TVF. It is possible, however, to assign INSERT, UPDATE, and
DELETE permissions on one form of TVF known as an inline TVF—in some cases, this particular form
can be updated.
In addition to these permissions, there are scenarios where you also need to assign the REFERENCES
permission to users so that they can correctly execute a UDF. These scenarios include functions that:
Note: Assigning permissions to the public role on stored procedures is not appropriate.
Permission Sets
No matter what .NET Framework code is included in an assembly, the actions the code can execute are
determined by the permission set specified when creating the assembly.
The SAFE permission set strictly limits the actions that the assembly can perform and inhibits it from
accessing external system resources. Code using this permission set can access the local instance of
SQL Server by using a direct access path, called a context connection. The SAFE permission set is the
default.
The EXTERNAL_ACCESS permission set allows the code to access local and network resources,
environment variables, and the registry. EXTERNAL_ACCESS is even necessary for accessing the same
SQL Server instance if a connection is made through a network interface.
The UNSAFE permission set relaxes many standard controls over code so you should avoid using it.
The EXTERNAL_ACCESS and UNSAFE permission sets require additional setup. It is not sufficient simply to
specify the EXTERNAL_ACCESS permission set when executing the CREATE ASSEMBLY statement.
You must either flag the database as TRUSTWORTHY (which is easy, but not recommended), or create an
asymmetric key from the assembly file in the master database, create a login that maps to the key, and
grant that login the EXTERNAL ACCESS ASSEMBLY permission.
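The asymmetric key approach can be sketched as follows; the file path, key name, and login name are assumptions for illustration:

```sql
USE master;
GO
-- Create a key from the signed assembly file (hypothetical path):
CREATE ASYMMETRIC KEY SQLCLR_Key
FROM EXECUTABLE FILE = 'D:\Assemblies\MyAssembly.dll';
GO
-- Map a login to the key and grant it the required server-level permission:
CREATE LOGIN SQLCLR_Login FROM ASYMMETRIC KEY SQLCLR_Key;
GO
GRANT EXTERNAL ACCESS ASSEMBLY TO SQLCLR_Login;
```

The assembly can then be created with PERMISSION_SET = EXTERNAL_ACCESS without flagging the database as TRUSTWORTHY.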
Having the same owner for all objects in a schema (which itself also has an owner) simplifies permission
management. However, it is still important to understand that ownership chain problems can occur and
you should know how to resolve them.
Ownership chaining applies to stored procedures, views, and functions. The slide shows an example of
how ownership chaining applies to views or stored procedures.
2. User2 creates a view that accesses the table and grants User1 permission to access the view. Access is
granted as User2 is the owner of both the top level object (the view) and the underlying object (the
table).
3. User2 then creates a view that accesses a table owned by User3. Even if User2 has permission to
access the table and grants User1 permission to use the view, User1 will be denied access because of
the broken chain of ownership from the top level object (the view) to the underlying object (the
table).
4. However, if User3 grants User1 permission directly on the underlying table, User1 can then access the
view that User2 created to access that table.
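The broken-chain scenario in steps 3 and 4 might look like this; the schema, view, and table names are assumptions for illustration:

```sql
-- Assume User2 owns schema S2, and User3 owns schema S3 and table S3.Orders.
CREATE VIEW S2.OrdersView AS SELECT OrderID FROM S3.Orders;
GO
GRANT SELECT ON S2.OrdersView TO User1;
-- User1 still cannot query S2.OrdersView: the ownership chain from the view
-- (owned by User2) to S3.Orders (owned by User3) is broken, so permissions
-- on the underlying table are checked for User1.

GRANT SELECT ON S3.Orders TO User1;
-- Now User1's query against S2.OrdersView succeeds.
```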
Demonstration Steps
1. In SQL Server Management Studio, on the File menu, point to Open, and then click File.
2. In the Open File dialog box, navigate to D:\Demofiles\Mod03, click
AuthorizingUsersToExecuteCode.sql, and then click Open.
3. Execute the code under the heading for Step 1 to change database context.
4. Execute the code under the heading for Step 2 to change execution context.
5. Execute the code under the heading for Step 3 to try to execute the uspGetManagerEmployees
stored procedure. Note that permission is denied.
6. Execute the code under the heading for Step 4 to revert the execution context.
7. Execute the code under the heading for Step 5 to grant EXECUTE permissions for the stored
procedure.
8. Execute the code under the heading for Step 6 to change execution context.
9. Execute the code under the heading for Step 7 to try to execute the uspGetManagerEmployees
stored procedure again. Note that this time the code executes.
10. Execute the code under the heading for Step 8 to try to execute the ufnGetStock function. Note that
permission is denied.
11. Execute the code under the heading for Step 9 to revert the execution context.
12. Execute the code under the heading for Step 10 to grant EXECUTE permissions on the function.
13. Execute the code under the heading for Step 11 to change the execution context and test the new
permission. Note that the function now works as expected.
15. Leave SQL Server Management Studio open for the next demonstration.
Lesson 3
Configuring Permissions at the Schema Level
Schemas are used as containers for objects such as tables, views, and stored procedures. They can be
particularly helpful in providing a level of organization and structure when large numbers of objects exist
in a database. You can also assign security permissions at the schema level, rather than individually on the
objects contained in the schemas. Doing this can greatly simplify the design of system security
requirements.
Lesson Objectives
After completing this lesson, you will be able to:
Note: If you are upgrading applications from SQL Server 2000 or earlier versions, note that
schemas were not used then, and that the naming convention consisted of
Server.Database.Owner.Object. When upgrading databases, SQL Server will automatically create a
schema using the same name as the object owner.
You can assign a default schema to users; this is used when a user refers to an object without specifying a
schema name.
There are a few built-in schemas in SQL Server. The dbo and guest users have associated schemas
attached to their own names. The sys and INFORMATION_SCHEMA schemas are reserved for system
objects that you cannot drop or create objects in.
Note: Creating users from certificates is an advanced topic that is beyond the scope of this
course.
When locating an object, SQL Server will first check the user’s default schema. If the object is not found,
SQL Server will then check the dbo schema. Therefore, it is important to include schema names when
referring to an object, as shown in the following example:
Locating Objects
SELECT ProductID, Name FROM Production.Product;
Apart from rare situations, using multipart names leads to more reliable code that does not depend on
default schema settings.
Granting Permissions
USE MarketDev;
GO
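The fragment above might continue with schema-level grants such as the following; the schema and role names are assumptions for illustration:

```sql
-- Grants at schema scope apply to every current and future object in the schema.
GRANT SELECT ON SCHEMA::Marketing TO SalesTeam;
GRANT EXECUTE ON SCHEMA::Marketing TO SalesTeam;
```

A single schema-level grant replaces what might otherwise be dozens of object-level grants, which is the simplification this lesson describes.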
Demonstration Steps
1. In SQL Server Management Studio, on the File menu, point to Open, and then click File.
2. In the Open File dialog box, navigate to D:\Demofiles\Mod03, click
ConfiguringPermissionsAtSchemaLevel.sql, and then click Open.
3. Execute the code under the heading for Step 1 to change database context.
4. Execute the code under the heading for Step 2 to revoke permission on the
uspGetManagerEmployees stored procedure.
5. Execute the code under the heading for Step 3 to confirm that permission was revoked.
6. Execute the code under the heading for Step 4 to grant EXECUTE permissions on the dbo schema.
7. Execute the code under the heading for Step 5 to try to confirm that permission is now granted on
the schema and the stored procedure in it.
8. Execute the code under the heading for Step 6 to deny permission to execute the stored procedure.
9. Execute the code under the heading for Step 7 to confirm that permission is denied.
10. Execute the code under the heading for Step 8 to change database context.
11. Execute the code under the heading for Step 9 to create a new function.
12. Execute the code under the heading for Step 10 to explore which permissions imply the ability to
select from a schema.
13. Execute the code under the heading for Step 11 to explore which permissions imply the ability to
view the definition of an object.
14. Execute the code under the heading for Step 12 to explore which permissions imply the ability to
select from an object.
15. Execute the code under the heading for Step 13 to drop the user and the login.
Sales schema:
o SalesOrderHeader table
o SalesOrderDetail table
Products schema:
o Product table
o ProductSubcategory table
o ProductCategory table
o vProductCatalog view
o Customer table
Sales managers.
An e-commerce web application that runs as the adventureworks\WebApplicationSvc service
account.
The domain administrator has created the following domain local groups, with the members shown:
ADVENTUREWORKS\Database_Managers:
o ADVENTUREWORKS\IT_Support
ADVENTUREWORKS\InternetSales_Users:
o ADVENTUREWORKS\Sales_Asia
o ADVENTUREWORKS\Sales_Europe
o ADVENTUREWORKS\Sales_NorthAmerica
ADVENTUREWORKS\InternetSales_Managers:
o ADVENTUREWORKS\Sales_Managers
A SQL Server administrator has created the following server logins and database users in the
InternetSales database, mapped to their relevant Windows accounts:
Database_Managers
InternetSales_Users
InternetSales_Managers
WebApplicationSvc
The e-commerce application must be able to read data from the Products.vProductCatalog view.
The e-commerce application must be able to insert rows into the Sales.SalesOrderHeader and
Sales.SalesOrderDetail tables.
All sales employees and managers must be able to read all data in the Customers table.
Sales managers must be able to insert and update any data in the Sales schema.
All sales employees and managers must be able to read all data in the Sales schema.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
3. Start SQL Server Management Studio and connect to the MIA-SQL instance of SQL Server.
4. In a new query window, write and execute a script to grant the permissions for the e-commerce
application in the InternetSales database.
5. In the query window, write and execute a script to grant the permissions for the Customer table.
6. Open a command prompt and enter the following command (which opens the sqlcmd utility as
ADVENTUREWORKS\AnthonyFrizzell):
8. In the SQLCMD window, enter the following commands to verify your identity:
SELECT suser_name();
GO
9. In the SQLCMD window, type and execute Transact-SQL statements to verify that Anthony can select
data from the Customer table in the InternetSales database.
2. In SQL Server Management Studio, write and execute a statement to deny the Database_Managers
group SELECT permissions on the Customer table.
3. In the SQLCMD window, type and execute Transact-SQL statements to verify that Anthony cannot
select data from the Customer table in the InternetSales database.
3. In the SQLCMD window, type and execute Transact-SQL statements to verify that Anthony can access
the Customer table through his membership of the Sales_Managers global group, and hence the
InternetSales_Managers local group and SQL Server login.
5. In SQL Server Management Studio, close the query window without saving any changes.
6. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you will have assigned the required object-level permissions.
3. In a new query window, write and execute a script to grant permission for the sales managers to run
the ChangeProductPrice stored procedure.
3. In the SQLCMD window, type and execute Transact-SQL statements to verify that Deanna can run the
stored procedure.
4. Close the SQLCMD window.
5. In SQL Server Management Studio, write and execute a Transact-SQL statement to verify that the
stored procedure updated the price.
6. In SQL Server Management Studio, close the query window without saving any changes.
7. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you will have assigned the required EXECUTE permissions on
stored procedures.
3. In a new query window, write and execute a script to grant permission for the sales managers to
insert and update data in the Sales schema, and for the sales employees and managers to read data
in the Sales schema.
6. In SQL Server Management Studio, close the query window without saving any changes.
Results: After completing this exercise, you will have assigned the required schema-level permissions.
Question: Your organization needs to track data access by individual Windows users. Does
this mean you cannot base logins on Windows groups?
Best Practice: Assigning permissions at schema level can simplify your security
architecture.
Review Question(s)
Question: Regarding permissions, how does SQL Server differ from ANSI SQL?
Module 4
Protecting Data with Encryption and Auditing
Contents:
Module Overview
Lesson 1: Options for Auditing Data Access in SQL Server
Module Overview
When configuring security for your Microsoft® SQL Server® systems, you should ensure that you meet
any of your organization’s compliance requirements for data protection. Organizations often need to
adhere to industry-specific compliance policies, which mandate auditing of all data access. To address this
requirement, SQL Server provides a range of options for implementing auditing.
Another common compliance requirement is the encryption of data to protect against unauthorized
access in the event that access to the database files is compromised. SQL Server supports this requirement
by providing transparent data encryption (TDE). To reduce the risk of information leakage by users with
administrative access to a database, columns containing sensitive data—such as credit card numbers or
national identity numbers—can be encrypted using the Always Encrypted feature.
This module describes the available options for auditing in SQL Server, how to use and manage the SQL
Server Audit feature, and how to implement encryption.
Objectives
After completing this module, you will be able to:
Describe the options for auditing data access.
Lesson 1
Options for Auditing Data Access in SQL Server
SQL Server provides a variety of tools that you can use to audit data access. In general, no one tool meets
all possible auditing requirements and a combination of features is often required.
In this lesson, you will learn about the different auditing options that are available.
Lesson Objectives
At the end of this lesson, you will be able to:
Note: Azure® SQL Database cannot currently be configured to comply with the Common
Criteria.
When the option is enabled, three changes occur to how SQL Server operates:
Residual Information Protection (RIP). Memory is always overwritten with a known bit pattern
before being reused.
Login auditing. Information about the most recent successful and unsuccessful login attempts is
recorded, and can be viewed through the sys.dm_exec_sessions view.
Column GRANT does not override table DENY. This changes the default behavior of the
permission system.
Note: The implementation of RIP increases security, but can negatively impact the
performance of the system.
To comply with Common Criteria Evaluation Assurance Level 4+ (EAL4+), in addition to enabling the
common criteria compliance enabled option, you must also download and run a script that makes
further configuration changes to SQL Server. You can download this script from the Microsoft SQL Server
Common Criteria compliance website.
For more information on SQL Server’s compliance with the Common Criteria, see An introduction to the
Common Criteria:
For more information on the common criteria compliance enabled server option, see common criteria
compliance enabled Server Configuration Option in Microsoft Docs:
Data definition language (DDL) triggers. These triggers are associated with DDL statements that
create, alter, or drop database objects.
Note: Azure SQL Database includes support for DML triggers and DDL triggers. Logon
triggers are not supported in Azure SQL Database.
All of these trigger types can play a role in auditing. All trigger types are created using the CREATE
TRIGGER statement. In an auditing context, AFTER triggers are most often used.
DML Triggers
DML triggers are configured to fire when INSERT, UPDATE, and/or DELETE statements run against a table.
You can access the original and new information by using the internal inserted and deleted tables in the
trigger code. When used for auditing, triggers are commonly used to write details of a change to another
table—the table holding the audited copy of the data might be in another database on the same SQL
Server instance.
The following example shows a DML trigger that runs when an update occurs on a row in the
dbo.Employee table. The original data is logged in the dbo.EmployeeSalaryAudit table.
-- The trigger header was missing from the original; the trigger name is an assumption.
CREATE TRIGGER TR_Employee_SalaryAudit
ON dbo.Employee
AFTER UPDATE
AS
BEGIN
    IF UPDATE(Salary)
    BEGIN
        INSERT dbo.EmployeeSalaryAudit (EmployeeID, OldSalary, NewSalary, UpdatedBy, UpdatedAt)
        SELECT i.EmployeeID, d.Salary, i.Salary, SUSER_NAME(), GETDATE()
        FROM inserted AS i
        JOIN deleted AS d
            ON i.EmployeeID = d.EmployeeID;
    END;
END;
GO
For more information on DML triggers, see DML Triggers in Microsoft Docs:
DML Triggers
http://aka.ms/ys2yfy
Logon Triggers
A logon trigger fires in response to a login that is authenticated but before a session is established. Logon
triggers can be used to record the logon, either to a table or to the SQL Server error log.
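A minimal sketch of a logon trigger that records logons to a table; the table and trigger names are assumptions, the table must already exist, and connecting principals must be able to insert into it. Note that a faulty logon trigger can block all connections, so test such triggers carefully:

```sql
-- Hypothetical audit table:
CREATE TABLE master.dbo.LogonAudit
(LoginName sysname NOT NULL, LogonTime datetime2 NOT NULL);
GO
CREATE TRIGGER TR_LogonAudit
ON ALL SERVER
FOR LOGON
AS
BEGIN
    -- Record who logged on and when; fires after authentication,
    -- before the session is established.
    INSERT master.dbo.LogonAudit (LoginName, LogonTime)
    VALUES (ORIGINAL_LOGIN(), SYSDATETIME());
END;
```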
For more information on logon triggers, see Logon Triggers in Microsoft Docs:
Logon Triggers
http://aka.ms/lqyc2g
DDL Triggers
A DDL trigger fires in response to a DDL event that creates, drops, and/or alters either a specific class of
database object or all database objects. In an auditing context, DDL triggers are commonly used to track
changes to the schema of a database by recording DDL statements to a table.
For more information on DDL triggers, see DDL Triggers in Microsoft Docs:
DDL Triggers
http://aka.ms/qzkssj
Limitations
Triggers have some limitations as an audit tool:
System performance can be significantly impacted by DML triggers running alongside the usual load
on the server.
Users with appropriate permissions can disable triggers. This can be a significant issue for auditing
requirements.
DML triggers cannot be used to audit data access through SELECT statements.
Only limited ability to control trigger firing order is provided. To make sure that it captures all the
changes made by other triggers, auditing would normally need to be the last DML trigger that fires—
which you can only specify by using the sp_settriggerorder system procedure.
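For example, an auditing trigger can be forced to fire last for a given statement type with sp_settriggerorder; the trigger name below is an assumption:

```sql
-- Ensure the (hypothetical) audit trigger is the last UPDATE trigger to fire,
-- so it captures changes made by any other triggers.
EXEC sp_settriggerorder
    @triggername = N'dbo.TR_Employee_SalaryAudit',
    @order = N'Last',
    @stmttype = N'UPDATE';
```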
For more information on triggers, see CREATE TRIGGER (Transact-SQL) in Microsoft Docs:
Temporal tables cannot be used to audit data access through SELECT statements.
When auditing with temporal tables, you cannot define different actions for INSERT, UPDATE, and
DELETE statements—unlike DML triggers.
The history table used to create a temporal table must be created in the same database as the current
table.
Tracking the identity of the user who made a change requires that you alter the definition of the
table to include a column that defaults to SUSER_SNAME.
For more information on working with temporal tables, see Temporal Tables in Microsoft Docs:
Temporal Tables
http://aka.ms/ft8fc2
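A sketch of a system-versioned temporal table that also tracks the user who made each change, per the limitation noted above; all object names are assumptions:

```sql
CREATE TABLE dbo.Employee
(
    EmployeeID int NOT NULL PRIMARY KEY,
    Salary money NOT NULL,
    -- Captures the login that made the change, since temporal tables
    -- do not record this by default:
    ModifiedBy sysname NOT NULL DEFAULT SUSER_SNAME(),
    ValidFrom datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
    ValidTo datetime2 GENERATED ALWAYS AS ROW END NOT NULL,
    PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
)
WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.EmployeeHistory));
```

The history table must be in the same database as the current table, as noted above.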
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. Start SQL Server Management Studio and connect to your Azure instance running the
AdventureWorksLT database, using SQL Server authentication. In the Login box, type Student, in
the Password box, type Pa55w.rd, and then click Connect.
4. Open the Demo 01 - temporal table audit.sql query, and connect to the AdventureWorksLT
database.
5. Execute the code under the heading for Step 2 to create a system-versioned temporal table.
6. Execute the code under the heading for Step 3 to insert some example data.
7. Execute the code under the heading for Step 4 to update a row.
8. Execute the code under the heading for Step 5 to examine the current and history tables that make
up the temporal table.
9. Execute the code under the heading for Step 6 to demonstrate the behavior of the FOR SYSTEM_TIME
ALL subclause.
10. Execute the code under the heading for Step 7 to demonstrate the behavior of the FOR SYSTEM_TIME
AS OF subclause.
11. Execute the code under the heading for Step 8 to demonstrate that the history table cannot be
edited. Both commands will generate an error.
12. Execute the code under the heading for Step 9 to demonstrate that a user with permission to update
the table directly can insert misleading information.
13. Execute the code under the heading for Step 10 to examine the temporal table again after the
update.
14. When you have finished the demonstration, execute the code under the heading for Step 11 to
remove the demonstration objects.
15. Close SQL Server Management Studio, without saving any changes.
Lesson 2
Implementing SQL Server Audit
SQL Server includes a purpose-built auditing tool—SQL Server Audit. This lesson covers the architecture of
SQL Server Audit, and how to configure and work with it.
Lesson Objectives
At the end of this lesson, you will be able to:
Create audits.
Explain audit actions and action groups.
Monitor audits using audit-related dynamic management objects and system views.
Extended Events is important because SQL Server Audit is based on the Extended Events infrastructure.
The Extended Events engine is not tied to particular types of events—the engine is written in such a way
that it can process any type of event.
Additional Reading: Because Extended Events is the basis for SQL Server Audit, you could
opt to write your own auditing system based on Extended Events. See Module 12 of this course,
Tracing Access to SQL Server with Extended Events, for more information on working with
Extended Events.
Executables and executable modules can expose one or more Extended Events packages at run time.
Packages act as containers for the Extended Events objects and their definitions; a package may expose
any of the following object types:
Types and Maps. Reference data; data type and lookup value definitions.
Sessions. Links events and actions, filtered by predicates, to one or more targets. Typically, sessions
are user-defined.
SQL Server Audit is a special package within Extended Events; you cannot change its internal
configuration.
For more information on Extended Events, see Extended Events in Microsoft Docs:
Extended Events
http://aka.ms/b8p2e9
Server Audit Specification: collects many server-level action groups raised by Extended Events.
One per audit.
Database Audit Specification: collects database-level audit actions raised by Extended Events. One
per database per audit.
Actions: specific actions that can raise events and be added to the audit. For example, SELECT
operations on a table.
Target: receives and stores the results of the audit. Can be a file, the Windows Security event log,
or the Windows Application event log.
For more information about SQL Server Audit, see SQL Server Audit (Database Engine) in Microsoft Docs:
Queue delay. Time, in milliseconds, that can elapse before audit actions are forced to be written to
the target.
On failure. Action to take if the audit log is unavailable—continue, shut down server, or fail auditable
operations.
Maximum file size. Maximum size of each audit file (in MB).
Reserve disk space. Indicates whether to reserve disk space for audit files in advance.
Note: The value you configure for the queue delay needs to be a tradeoff between security
and performance. A low value ensures that events are logged quickly and avoids the risk of losing
items from the audit trail in the event of failure, but can result in a significant performance
overhead.
Audit Targets
Audits can be sent to one of the following three targets:
Binary file. File output provides the highest performance and is the easiest option to configure.
Windows Application Event Log. Avoid sending too much detail to this log as network
administrators tend to dislike applications that write too much content to any of the event logs. Do
not use this target for sensitive data because any authenticated user can view the log.
Windows Security Event Log. This is the most secure option for auditing data, but you need to add
the SQL Server service account to the Generate Security Audits policy before using it.
You should review the contents of the target that you use and archive its contents periodically.
The following code example creates and enables a server audit that uses a binary file as the target:
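A minimal sketch of such a statement, assuming the audit name SecurityAudit that later examples in this module reference, and an illustrative file path:

```sql
USE master;
GO
-- Create a server audit that writes binary audit files to the specified folder
CREATE SERVER AUDIT SecurityAudit
TO FILE (FILEPATH = 'X:\AuditFiles\', MAXSIZE = 100 MB)
WITH (QUEUE_DELAY = 1000, ON_FAILURE = CONTINUE);
GO
-- Audits are created in a disabled state; enable the audit explicitly
ALTER SERVER AUDIT SecurityAudit
WITH (STATE = ON);
```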
Note: The filename that you provide to the FILEPATH parameter when creating a server
audit is actually a path to a folder. SQL Server generates log files automatically and stores them in
this location.
For more information on the CREATE SERVER AUDIT command, see CREATE SERVER AUDIT (Transact-
SQL) in Microsoft Docs:
CREATE SERVER AUDIT (Transact-SQL)
http://aka.ms/mn06tw
Actions and action groups are linked to an audit through an audit specification.
Predefined action groups are available at server level, database level, and audit level. Actions are only
available at database level.
For a full list of available server actions and action groups, see SQL Server Audit Action Groups and Actions
in Microsoft Docs:
For a full list of available server action groups, see Server-Level Audit Action Groups in SQL Server Audit
Action Groups and Actions in Microsoft Docs:
SQL Server Audit Action Groups and Actions - Server-Level Audit Action Groups
http://aka.ms/bak8rw
The following example creates and enables a server audit specification to track failed and successful login
attempts. This code assumes that the server audit SecurityAudit has already been created.
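A sketch of such a specification (the specification name AuditLogins is illustrative; the two action groups cover successful and failed login attempts):

```sql
USE master;
GO
-- Bind a server audit specification to the existing SecurityAudit server audit
CREATE SERVER AUDIT SPECIFICATION AuditLogins
FOR SERVER AUDIT SecurityAudit
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);
```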
For more information on the CREATE SERVER AUDIT SPECIFICATION command, see CREATE SERVER
AUDIT SPECIFICATION (Transact-SQL) in Microsoft Docs:
SQL Server Audit Action Groups and Actions - Database-Level Audit Action Groups and
Database-Level Audit Actions
http://aka.ms/bak8rw
The following example creates an audit specification that includes all database principal changes and all
SELECT queries on objects in the salesapp1 schema by members of the db_datareader fixed database-
level role. This code assumes that the server audit SecurityAudit has already been created.
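A sketch of such a specification, assuming that salesapp1 is a database containing a Sales schema (as in the demonstrations later in this module); the specification name is illustrative:

```sql
USE salesapp1;
GO
-- Audit database principal changes, and SELECT statements on the Sales schema
-- issued by members of the db_datareader role
CREATE DATABASE AUDIT SPECIFICATION AuditDataAccess
FOR SERVER AUDIT SecurityAudit
ADD (DATABASE_PRINCIPAL_CHANGE_GROUP),
ADD (SELECT ON SCHEMA::Sales BY db_datareader)
WITH (STATE = ON);
```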
For more information on the CREATE DATABASE AUDIT SPECIFICATION command, see CREATE
DATABASE AUDIT SPECIFICATION (Transact-SQL) in Microsoft Docs:
Audit DMVs
The following DMVs return metadata about audits:
sys.dm_audit_class_type_map. Returns a
reference data list mapping audit class codes to
descriptions.
sys.dm_server_audit_status. Returns a list of all the audits defined on an instance of SQL Server, and
their current status.
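For example, a quick check of the state of all audits on an instance might look like this (a sketch; the column list is abbreviated):

```sql
-- List each audit, whether it has started, and where its files are written
SELECT name, status_desc, audit_file_path
FROM sys.dm_server_audit_status;
```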
For more information on audit-related DMVs, see the links in Security-Related Dynamic Management
Views and Functions (Transact-SQL) in Microsoft Docs:
sys.server_file_audits. Returns a list of all audits that write data to a file target.
For more information on audit-related system views, see the links in SQL Server Audit Views section of
Security Catalog Views (Transact-SQL) in Microsoft Docs:
Auditing in Azure SQL Database can log events in the following categories:
Plain SQL
Parameterized SQL
Stored Procedure
Login
Transaction Management
You may choose to audit success, failure, or both success and failure of each event category.
Audit data is held in a collection of Store Tables in a storage account that you select.
Auditing in Azure SQL Database is available for all service tiers (Basic, Standard, and Premium).
For more information and instructions on how to configure auditing in Azure SQL Database, see Get
started with SQL database auditing in the Azure documentation:
Get started with SQL database auditing
http://aka.ms/bnbvzb
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
3. In the User Account Control dialog box, click Yes, and then wait for the script to complete.
4. Start SQL Server Management Studio and connect to MIA-SQL using Windows authentication.
7. Execute the code under the heading for Step 1 to create a new audit.
8. Execute the code under the heading for Step 2 to enable the new audit.
9. Execute the code under the heading for Step 3 to add a server audit specification to the new audit.
10. Execute the code under the heading for Step 4 to add a database audit specification to the new
audit.
11. Execute the code under the heading for Step 5 to alter the database audit specification by adding an
additional action group.
12. Execute the code under the heading for Step 6 to examine the audit metadata.
13. Execute the code under the heading for Step 7 to examine the server audit specification metadata.
14. Execute the code under the heading for Step 8 to examine the database audit specification metadata.
15. Execute the code under the heading for Step 9 to remove the audit and specifications created for this
demonstration.
16. Leave SQL Server Management Studio open for the next demonstration.
The following code shows an example of how to call the sp_audit_write stored procedure from a
trigger; the body after AS is a sketch, and the BonusAmount column and event details are illustrative:
Calling sp_audit_write
CREATE TRIGGER HR.BonusChecker ON HR.EmployeeBonus
AFTER INSERT, UPDATE
AS
-- Log a custom audit event when a large bonus value is written
IF EXISTS (SELECT 1 FROM inserted WHERE BonusAmount > 10000)
    EXEC sys.sp_audit_write 1000, 1, N'Large bonus recorded';
Note: You must ensure that all principals who may trigger custom audit actions have been
granted EXECUTE permission on the sys.sp_audit_write stored procedure in the master
database. The easiest way to ensure this is to grant EXECUTE permission on sys.sp_audit_write to
public.
sp_audit_write (Transact-SQL)
http://aka.ms/eeq2c4
Demonstration Steps
1. In Solution Explorer, open the Demo 03 - custom audit.sql query.
2. Execute the code under the heading for Step 1 to create a new audit.
3. Execute the code under the heading for Step 2 to create a server audit specification including the
USER_DEFINED_AUDIT_GROUP action group.
4. Execute the code under the heading for Step 3 to call sp_audit_write directly.
5. Execute the code under the heading for Step 4 to demonstrate how the custom event appears in the
audit log file.
6. Execute the code under the heading for Step 5 to create a stored procedure that uses
sp_audit_write. The stored procedure will log a custom audit event if the discount applied to a row
in the Sales.OrderDetails table is greater than 30 percent.
7. Execute the code under the heading for Step 6 to call the new stored procedure twice. The second
call should cause a custom audit event to be logged because the discount applied is 45 percent.
8. Execute the code under the heading for Step 7 to examine how the custom event was recorded in
the audit log file.
9. Execute the code under the heading for Step 8 to drop the demonstration audit objects.
10. Leave SQL Server Management Studio open for the next demonstration.
(Slide content: Target, Target Group, Server Audit.)
Lesson 3
Managing SQL Server Audit
After you have configured SQL Server Audit, you need to know how to work with audit data. This lesson
covers the different ways you can access audit data, and how to work with the audit record format. It also
covers some potential issues you may encounter when working with servers and databases that have SQL
Server Audit enabled.
Lesson Objectives
At the end of this lesson, you will be able to:
Explain potential issues you might encounter when using SQL Server Audit.
The sys.fn_get_audit_file function takes three parameters—the file pattern, the initial file name, and the
audit record offset. The file pattern can be in one of three formats:
<path>\* that collects all audit files in the specified location. The asterisk character is a wildcard.
<path>\<audit name>_{GUID} that collects all audit files that have the specified name and GUID
pair.
<path>\<file name>.sqlaudit that reads a single, named audit file.
This example shows how to use sys.fn_get_audit_file to read all the audit files in a specific directory:
sys.fn_get_audit_file—basic usage
SELECT *
FROM sys.fn_get_audit_file('X:\AuditFiles\*',default,default);
Disabling an Audit
USE master;
ALTER SERVER AUDIT SecurityAudit
WITH (STATE = OFF);
Each audit is identified by a GUID. If you restore or attach a database on a server, SQL Server attempts
to match the GUID in the database with the GUID of the audit on the server. If no match occurs,
auditing will not work until you correct the issue by executing the CREATE SERVER AUDIT command
to set the appropriate GUID using the AUDIT_GUID option.
Mirrored servers introduce a similar issue of mismatched GUIDs. The mirror partner must have a
server audit with the same GUID. You can create this by using the CREATE SERVER AUDIT command
and supplying the GUID value to match the one on the principal server.
If databases are attached to editions of SQL Server that do not support the same level of audit
capability, the attach works but the audit is ignored.
You should consider the performance impact of audit writes and whether you need to minimize your
audit list to maximize performance.
If insufficient disk space is available to hold audit files and you have configured an audit to shut down
the server on failure, SQL Server might not start. In this situation, you may need to force entry to it by
starting SQL Server in minimal configuration mode with the -f startup parameter.
For more information on specifying a GUID when you create a server audit, see CREATE SERVER AUDIT
(Transact-SQL) in Microsoft Docs:
How to view the output of an audit with a Windows event log target.
Demonstration Steps
1. In Solution Explorer, open the Demo 04 - audit output.sql query.
2. Execute the code under the heading for Step 1 to create an audit with a file target.
3. Execute the code under the heading for Step 2 to create an audit with a Windows application log
target.
4. Execute the code under the heading for Step 3 to add a specification to each audit that collects
SELECT statements run against the salesapp1.Sales schema.
5. Execute the code under the heading for Step 4 to execute a select statement that will be audited.
6. Execute the code under the heading for Step 5 to examine the contents of the audit file target. Point
out the most useful fields.
8. In Event Viewer, expand the Windows Logs node, then click Application. The audit entry will be the
most recent one in the Application pane with a Source value of MSSQLSERVER. Demonstrate the
entry, then close Event Viewer.
9. Execute the code under the heading for Step 7 to remove the demonstration audit objects.
10. Leave SQL Server Management Studio open for the next demonstration.
(Slide content: File, Ring Buffer.)
Lesson 4
Protecting Data with Encryption
Many organizations are obliged by their security compliance policies to protect data at rest by encrypting
it, to mitigate the risk of the physical theft of data storage media, such as disks and backup tapes. SQL
Server includes a method of encrypting data at rest—Transparent Data Encryption (TDE).
To protect the most sensitive data—for example, credit card numbers, or national identity numbers—from
unauthorized access, SQL Server offers Always Encrypted, which allows data values to be encrypted by an
application before they are inserted into a database. TDE can use enterprise encryption key management
systems through the Extensible Key Management feature. You can also use Dynamic Data Masking to
obfuscate or conceal sensitive data from users who are not authorized to see it.
Lesson Objectives
At the end of this lesson, you will be able to:
Explain Transparent Data Encryption.
Note: TDE protects data at rest in database files. Data pages in the buffer pool or returned
to client applications are not encrypted by TDE.
Database master key (DMK). The DMK for the master database is used to generate a certificate in
the master database. SQL Server uses the SMK and a password that you specify to generate the DMK,
and stores it in the master database.
Note: You can use a password without the SMK to generate a DMK, although this is less secure.
Server certificate. A server certificate is generated in the master database, and is used to encrypt an
encryption key in each TDE-enabled database.
Database encryption key (DEK). A DEK in the user database is used to encrypt the entire database.
Note: When a database is configured to use TDE, CPU utilization for SQL Server may
increase due to the overhead of encrypting and decrypting data pages.
Enabling TDE
TDE is only available for production use in SQL Server Enterprise edition. To enable TDE, you must
perform the following steps:
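In outline: create a database master key in master, create a server certificate, create a database encryption key protected by that certificate, and then turn encryption on. A sketch, reusing the TDE_cert certificate name and salesapp1 database from the lab for this module (the password is a placeholder):

```sql
USE master;
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
CREATE CERTIFICATE TDE_cert WITH SUBJECT = 'TDE server certificate';
GO
USE salesapp1;
-- Create a database encryption key protected by the server certificate
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDE_cert;
GO
-- Turn on encryption for the database
ALTER DATABASE salesapp1 SET ENCRYPTION ON;
```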
Additional Reading: TDE is available in Azure SQL Database; for more information, see
Encryption with Azure SQL Database later in this lesson.
For more information about TDE, see Transparent Data Encryption (TDE) in Microsoft Docs:
2. Copy or move the database files to the same location on the destination server.
3. Create a service master key in the master database on the destination server.
4. Use a CREATE CERTIFICATE Transact-SQL statement to generate a server certificate on the destination
server from the backup of the original server certificate and its private key.
For more information on moving a TDE encrypted database, see Move a TDE Protected Database to
Another SQL Server in Microsoft Docs:
EKM is only available for production use in SQL Server Enterprise edition.
For general information about managing encryption keys in SQL Server, see SQL Server and Database
Encryption Keys (Database Engine) in Microsoft Docs:
SQL Server and Database Encryption Keys (Database Engine)
http://aka.ms/phwy4p
To enable EKM support, you must enable the server-level EKM provider enabled option, and then create
credentials to allow SQL Server to access the EKM provider.
For information about configuring EKM support, see Extensible Key Management (EKM) in Microsoft Docs:
Extensible Key Management (EKM)
http://aka.ms/dmhvxl
For information on using Azure Key Vault as an EKM provider for SQL Server, see Extensible Key
Management Using Azure Key Vault (SQL Server) in Microsoft Docs:
Always Encrypted
The Always Encrypted feature allows data to be
transparently encrypted using a special database
driver, without the encryption keys being accessible
in the database. This means you can store sensitive
data in databases over which you do not have
complete administrative control—for example,
database instances hosted by cloud services—or
where data is so sensitive that it should not be
accessible to SQL Server administrators. Always
Encrypted is unlike TDE in the following significant
ways:
Data is encrypted both at rest and in motion.
Encryption and decryption take place at the client application.
Application on-premises, database in the cloud. In this scenario, Always Encrypted might be used to
protect sensitive data from accidental or malicious access by cloud service administrators.
Encryption Types
Two types of encryption are supported by Always Encrypted:
Deterministic encryption. A given plain-text value will always produce the same cipher-text value. This
allows equality filtering, joins, and grouping on encrypted values, but may allow an attacker to guess
column values by analyzing patterns in the encrypted values.
Randomized encryption. The cipher-text value cannot be predicted from the plain-text value.
This form of encryption is more secure, but the column value cannot be used in filters or grouping
expressions.
Column master keys. As with master keys used in TDE, column master keys are used to create and
protect column encryption keys. Column master keys must be stored in a trusted key store.
Column encryption keys. Used to encrypt column data. Column encryption keys—encrypted with a
column master key—are securely stored in the database.
2. Retrieve the relevant column master key from the trusted key store.
3. Use the column master key to decrypt the column encryption key.
4. Use the decrypted column encryption key to decrypt the column data.
Restrictions
There are many limitations on the use of Always Encrypted, including:
Always Encrypted may not be used on columns of any of the following data types: xml, rowversion,
image, ntext, text, sql_variant, hierarchyid, geography, geometry, aliased types (such as
sysname), and user defined-types.
Columns of any string data type (char, varchar, nvarchar, and so on) must use a _BIN2 collation to
be eligible for Always Encrypted.
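For example, an eligible column definition might look like this (a sketch; it assumes a column encryption key named CEK_Auto1 has already been created, and the table and column names are illustrative):

```sql
CREATE TABLE dbo.Customers
(
    CustomerID int NOT NULL PRIMARY KEY,
    -- String columns must use a _BIN2 collation to support Always Encrypted
    SSN char(11) COLLATE Latin1_General_BIN2
        ENCRYPTED WITH (COLUMN_ENCRYPTION_KEY = CEK_Auto1,
                        ENCRYPTION_TYPE = DETERMINISTIC,
                        ALGORITHM = 'AEAD_AES_256_CBC_HMAC_SHA_256') NOT NULL
);
```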
For more information on Always Encrypted, including a full list of restrictions, see Always Encrypted
(Database Engine) in Microsoft Docs:
Mask Formats
Four separate data masks are available for different
use cases:
Default. The data is fully masked. The mask value will vary based on the data type of the masked
column.
Email. For string data types only. The first letter of the data is exposed; the remainder of the data is
masked with “X”, and the value has the constant suffix “.com”—regardless of the actual top-level
domain of the masked email addresses.
Custom String. For string data types only. One or more letters at the start and end of the string are
exposed. The remainder of the data is masked with a mask you can define.
Random. For numeric types only. The original value is masked with a random value from within a
range you specify.
Note: A user without the UNMASK permission, but with the UPDATE permission, will be
able to update a masked column.
Restrictions
A mask cannot be specified on columns that meet the following criteria:
FILESTREAM columns
COLUMN_SET columns
Calculated columns
The following example demonstrates how to define a column masked with the default mask as part of a
CREATE TABLE statement:
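A sketch of such a statement (table and column names are illustrative; the salary column echoes the one used in the demonstration later in this lesson):

```sql
CREATE TABLE dbo.employee_data
(
    employee_id int NOT NULL PRIMARY KEY,
    -- The default() mask fully masks the value for users without the UNMASK permission
    salary money MASKED WITH (FUNCTION = 'default()') NOT NULL
);
```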
For more information on dynamic data masking, see Dynamic Data Masking in Microsoft Docs:
TDE
TDE is supported on Azure SQL Database. It can be
configured using Transact-SQL, as discussed earlier
in this lesson, or it can be configured using the
Azure Portal, or by using Azure PowerShell. When
you use TDE on Azure SQL Database, you just mark
a database as encrypted. Encryption keys and
certificates are managed by Microsoft.
Transparent Data Encryption for Azure SQL Database and Data Warehouse
http://aka.ms/edq1g5
EKM
EKM is not supported in Azure SQL Database, but you can use the Azure Key Vault service as an EKM
provider to protect your encryption keys.
Always Encrypted
Always Encrypted is supported by Azure SQL Database. It is configured in the same way as for a SQL
Server instance.
Demonstration Steps
1. In Solution Explorer, open the Demo 05 - masking.sql query.
2. Execute the code under the heading for Step 1 to create a new table with masked columns, grant
permission to a test user, and insert test data.
3. Execute the code under the heading for Step 2 to demonstrate that an administrator can see
unmasked data.
4. Execute the code under the heading for Step 3 to demonstrate that a user with only SELECT
permission sees the masked data. Spend some time comparing the masked output to the table
definitions.
5. Execute the code under the heading for Step 4 to add a mask to the home_phone_number column.
6. Execute the code under the heading for Step 5 to demonstrate the new mask.
7. Execute the code under the heading for Step 6 to remove the mask from the salary column.
8. Execute the code under the heading for Step 7 to demonstrate that the mask on salary is no longer
in place.
9. Execute the code under the heading for Step 8 to grant the UNMASK permission to the test user.
Note that it is a database-level permission.
10. Execute the code under the heading for Step 9 to demonstrate the effect of the UNMASK permission.
11. Execute the code under the heading for Step 10 to drop the demonstration table.
12. Close SQL Server Management Studio, without saving any changes.
Categorize Activity
Categorize each item by the corresponding SQL Server feature. Indicate your answer by writing the
category number to the right of each item.
Items
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
o Name: activity_audit
o On failure: continue
o Target: file
o Name: audit_logins
o Audit: activity_audit
o Action groups: SUCCESSFUL_LOGIN_GROUP
o Name: employees_change_audit
o Audit: activity_audit
o Actions:
1. Encrypt a Column
2. View Always Encrypted Data from an Application
2. Run a SELECT statement against the salesapp1.Sales.Customers table to view the encrypted data.
2. Execute the code under the heading for Task 1 to create a service master key.
2. Under the heading for Task 4, edit the query to back up the TDE_cert server certificate to
D:\Labfiles\Lab04\Starter\Audit\TDE_cert.bak, and to back up the TDE_cert private key to
D:\Labfiles\Lab04\Starter\Audit\TDE_cert_pk.bak, using the password
*R8vkULA5aKhp3ekGg1o3.
Task 4: Create a Database Encryption Key and Encrypt the salesapp1 Database
1. Under the heading for Task 5, edit the query to create a database encryption key using the AES_256
algorithm, encrypted by the server certificate TDE_cert.
2. Execute the code under the heading for Task 6 to encrypt the salesapp1 database.
3. Execute the code under the heading for Task 7 to examine the sys.dm_database_encryption_keys
DMV. Notice that both salesapp1 and tempdb have been encrypted—tempdb is encrypted if any
database on the instance is encrypted.
4. Create a certificate named TDE_cert in the master database on the MIA-SQL\SQL2 instance from the
backup certificate and private key files you created previously.
5. Attach the salesapp1 database to the MIA-SQL\SQL2 instance. Verify that you can access the data it
contains.
Deterministic encryption
Randomized encryption
Choose the option to shut down SQL Server on audit failure. There is usually no point in setting
up auditing if situations can arise where events occur but are not audited. This is particularly
important in high-security environments.
Make sure that file audits are placed on drives with large amounts of free disk space and ensure
that the available disk space is monitored on a regular basis.
Best Practice: When planning to implement database encryption, consider the following
best practices:
Use a complex password to protect the database master key for the master database.
Ensure you back up certificates and private keys used to implement TDE, and store the backup
files in a secure location.
If you need to implement data encryption on multiple servers in a large organization, consider
using an EKM solution to manage encryption keys.
If you intend to use Always Encrypted, plan how you will store column master keys to make them
accessible to applications that need to encrypt and decrypt data.
Review Question(s)
Question: You may wish to audit actions by a DBA. How would you know if the DBA
stopped the audit while performing covert actions?
Module 5
Recovery Models and Backup Strategies
Contents:
Module Overview 5-1
Lesson 1: Understanding Backup Strategies 5-2
Module Overview
One of the most important aspects of a database administrator's role is ensuring that organizational data
is reliably backed up so that, if a failure occurs, you can recover the data. Even though the computing
industry has known about the need for reliable backup strategies for decades—and discussed this at great
length—unfortunate stories regarding data loss are still commonplace. A further problem is that, even
when the strategies in place work as they were designed, the outcomes still regularly fail to meet an
organization’s operational requirements.
In this module, you will consider how to create a strategy that is aligned with organizational needs, based
on the available backup models, and the role of the transaction logs in maintaining database consistency.
Objectives
After completing this module, you will be able to:
Lesson 1
Understanding Backup Strategies
SQL Server supports three database recovery models. All models preserve data in the event of a disaster,
but there are important differences that you need to consider when selecting a model for your database.
Choosing the appropriate recovery model is an important part of any backup strategy. The recovery
model that you select for your database will determine many factors, including:
Lesson Objectives
After completing this lesson, you will be able to:
Key Criteria
When designing a backup strategy, there is always
a tradeoff between the level of safety that is
guaranteed and the cost of the solution. If you ask any business about how much data they can afford to
lose, you will almost certainly be told that they cannot afford to lose any data, in any circumstances. Yet
while zero data loss is an admirable goal, it is rarely affordable or realistic. For this reason, there are two
objectives that need to be established when discussing a backup strategy: a recovery time objective (RTO)
and a recovery point objective (RPO). Part of the strategy might also involve the retrieval of data from
other locations where copies of the data are stored.
Organizations often deploy large numbers of databases. The RPO and RTO for each database might be
different. This means that database administrators will often need to work with different backup strategies
for different databases that they are managing. Most large organizations have a method of categorizing
the databases and applications, in terms of importance to the core functions of the organization. The
business requirements will determine all aspects of the backup strategy, including how frequently backups
need to occur; how much data is to be backed up each time; the type of media that the backups will be
held on; and the retention and archival plans for the media.
Note: Compressed SQL Server backups cannot share media with other types of backup.
By using logical backup devices, an application can be designed to always send backups to the logical
backup device, instead of to a specific physical location.
For example, an HR application could be designed to create backups on a logical backup device named
HRBackupDevice. A database administrator could later determine where the backups should be physically
sent. The administrator could then decide the name of a file that should hold the backups from the HR
application. No changes would need to be made to the HR application to accommodate the change in
backup file location, as the application would always back up to the same logical backup device.
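A sketch of defining and using such a device, reusing the HRBackupDevice name from the example above (the database name and file path are illustrative):

```sql
-- Map the logical device name to a physical backup file
EXEC sp_addumpdevice
    @devtype = 'disk',
    @logicalname = 'HRBackupDevice',
    @physicalname = 'D:\Backups\HR.bak';
GO
-- The application always backs up to the logical device name
BACKUP DATABASE HR TO HRBackupDevice;
```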
With a mirrored backup (only available in Enterprise edition), the same backup data is written to each
backup device concurrently. This option provides for redundancy of the physical backup device. Only one
of the devices needs to be present during a restore process.
Note: While mirrored backups help provide fault tolerance regarding media failure after
the backup completes, mirrored backups are actually a fault-intolerant option during the backup
process. If SQL Server cannot write to one of the mirrored devices, the entire backup fails.
Striping of backups causes a single backup to be written across a set of backup devices. Each backup
device receives only part of the backup. All backup devices need to be present when a restore of the data
is required.
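A sketch combining both options (device paths and the database name are illustrative): the backup is striped across two disks, and the whole striped set is mirrored to two further disks.

```sql
BACKUP DATABASE salesapp1
TO DISK = 'D:\Backups\sales_1.bak',
   DISK = 'E:\Backups\sales_2.bak'         -- striped across two devices
MIRROR TO DISK = 'F:\Backups\sales_m1.bak',
          DISK = 'G:\Backups\sales_m2.bak' -- mirror of the striped set
WITH FORMAT;                               -- mirrored backups require a new media set
```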
Required Privileges
To perform a backup, you must be a member of the sysadmin fixed server role or the db_owner or
db_backupoperator fixed database roles.
Your organization will depend on the quality of backups if they need to be restored. The more copies
of backups that you have, and the more pieces of media that are holding all the required data, the
better the chance there is of being able to recover.
The act of creating a backup over your most recent backup is generally regarded as the worst
offence. If the system fails during the backup, you will often lose both your data and the backup.
Consider the example of a database administrator who asked for help on a Microsoft SQL Server
forum. The DBA had inadvertently performed a restore operation instead of a backup operation—and
the last backup was performed a year ago. Unfortunately, there was little that anyone could do to
help recover the situation.
Avoidance strategy: make multiple copies of backups.
Company A performed regular backups, but recovery had not been tested. The first time that a real
recovery was attempted, it was discovered that not all files that needed to be backed up were in fact
backed up.
Avoidance strategy: regular reconstruction of data from backup recovery testing.
Unreadable Backups
Company B performed regular backups but did not test them. When recovery was attempted, none
of the backups were readable. This situation is often caused by hardware failures, in addition to the
inappropriate storage of media.
Avoidance strategy: regular backup recovery testing.
Unavailable Hardware
Company C purchased a special tape drive to perform their backups. When they decided to restore
the backups, that special tape drive no longer worked and no other device within the organization
could read the backups—even if the backups were valid.
Avoidance strategy: regular backup recovery testing.
Old Hardware
Company D performed regular backups and retained them for an appropriate period. When the
company wanted to restore the backups, they no longer possessed equipment that was capable of
restoring them.
Avoidance strategy: regular backup recovery testing, combined with recovery and backup onto
current devices.
Misaligned Hardware
Company E performed regular backups and even tested that they could perform restore operations
from the backups. However, because they tested the restores on the same device that performed the
backups, they did not realize that the device was misaligned—and that this was the only device that
could read those backups. When a restore was needed, the device that the backups were performed
on failed.
Avoidance strategy: regular backup recovery testing on a separate system and a separate physical
device.
General Considerations
When multiple types of backups are being performed, it is important to work out the combination of
backups that will be needed when a restore is required.
Organizations might need to fulfill legal or compliance requirements regarding the retention of backups.
In most cases, full database backups are kept for a longer period of time than other backup types.
Checking the consistency of databases by using DBCC CHECKDB is a crucial part of database
maintenance, and is discussed later in this course.
You will need to determine not only how long backups should be kept, but also where they are kept. Part
of the RTO needs to consider how long it takes to obtain the physical backup media, if it needs to be
restored.
You also need to make sure that backups are complete. Are all files that are needed to recover the system
(including external operating system files) being backed up?
SQL Server backup to Azure Blob storage is sometimes referred to as SQL Server Backup to URL.
Implementing SQL Server Backup to Azure Blob storage is as simple as setting up an Azure storage
account, configuring the container, and specifying a URL as the backup destination when you back up
your database.
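As a minimal sketch, a backup to Azure Blob storage using a Shared Access Signature credential might look like the following. The storage account name, container name, and credential are illustrative placeholders, not values from this course:

```sql
-- Create a credential named after the container URL; the secret is a
-- Shared Access Signature token (placeholder shown here).
CREATE CREDENTIAL [https://mystorageaccount.blob.core.windows.net/backups]
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = '<shared access signature token>';

-- Back up the database directly to a blob in that container.
BACKUP DATABASE AdventureWorks
TO URL = 'https://mystorageaccount.blob.core.windows.net/backups/AdventureWorks.bak';
```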
Demonstration Steps
Install the Azure PowerShell Module
1. On the Start menu, type Windows PowerShell. Then right-click Windows PowerShell™, and click Run
ISE as Administrator.
3. At the command prompt, type Install-Module AzureRM -AllowClobber, and then press Enter.
4. If the message NuGet provider is required to continue is displayed, type Y, and then press Enter.
5. If the Untrusted repository message is displayed, type A, and then press Enter.
6. Wait until the installation completes, and then close the PowerShell window.
1. Run Setup.cmd in the D:\Demofiles\Mod05 folder as Administrator. In the User Account Control
dialog box, click Yes.
2. On the taskbar, click Internet Explorer, and go to portal.azure.com.
3. Log into your Azure account. In the left blade, click Storage accounts.
d. Performance: Standard
1. On the taskbar, right-click the Windows PowerShell icon, and then click Windows PowerShell ISE.
3. In the Open dialog box, go to D:\Demofiles\Mod05, click ContainerSAS.ps1, and then click Open.
4. Amend the $accountName to the Microsoft account that is associated with your Azure pass.
5. Amend the $storageAccountName to the name of the Storage Account you created in the previous
task.
8. In the Sign in to your account dialog box, enter your Microsoft Azure account user name and
password, and then click Sign in.
9. In the Confirm dialog box, click Yes to delete the account (don’t worry—this doesn’t actually delete
the account).
Note: Leave the window open—you will need the information displayed in the next task.
1. Start SQL Server Management Studio, and connect to the MIA-SQL instance.
5. Amend the sv=enter key here entry to the Shared Access Signature value displayed in the PowerShell window.
6. In SQL Server Management Studio, highlight the statement under the comment Create credential,
and then click Execute.
7. Highlight the statements underneath the comment Backup the database, click Execute, and wait for
the backup process to complete successfully.
8. In Internet Explorer, on the All resources blade, click Refresh, and then click the name of your
Storage account (classic).
9. On your account blade, under BLOB SERVICE, click Containers, and then click aw2016.
10. Verify that the logtest.bak backup file has been created.
Question: What are the advantages of using SQL Server Backup with Azure Blob storage?
Lesson 2
SQL Server Transaction Logs
Before you can plan a backup strategy, you must understand how SQL Server uses the transaction log to
maintain data consistency, and how the recovery model of the database affects transaction log
operations, in addition to the available backup options.
Lesson Objectives
After completing this lesson, you will be able to:
Write-Ahead Logging
When SQL Server needs to modify the data in a database page, it first checks whether the page is present
in the buffer cache. If the page is not present, it is read into the buffer cache. SQL Server then modifies
the page in memory, writing redo and undo information to the transaction log. The modified ("dirty")
page cannot be flushed to the data files until the corresponding log records have been written to the
transaction log. At regular intervals, a background checkpoint process flushes the dirty pages to the
database, writing all the modified data to disk.
This process is known as write-ahead logging (WAL) because all log records are written to the log before
the affected dirty pages are written to the data files and the transaction is committed. The WAL protocol
ensures that the database can always be set to a consistent state after a failure. This recovery process will
be discussed in detail later in this course—its effect is that transactions that were committed before the
failure occurred are guaranteed to be applied to the database. Those transactions that were “in flight” at
the time of the failure, where work is partially complete, are undone.
Writing all changes to the log file in advance also makes it possible to roll back transactions if required.
Transaction Rollback
SQL Server can use the information in the transaction log to roll back transactions that have only been
partially completed. This ensures that transactions are not left in a partially-completed state. A transaction
rollback may occur because of a request from a user or client application (such as the execution of a
ROLLBACK TRANSACTION statement) or because a transaction is in a partially-completed state at the time
of a system failure.
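A rollback can be requested explicitly from Transact-SQL, as in this sketch (the table and values are hypothetical, used only to illustrate the mechanism):

```sql
BEGIN TRANSACTION;

-- Modify some data; the change is recorded in the transaction log
-- before the data page is written (write-ahead logging).
UPDATE dbo.Accounts
SET Balance = Balance - 100
WHERE AccountID = 1;

-- SQL Server uses the undo information in the log to reverse the change.
ROLLBACK TRANSACTION;
```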
Log file growth increment      VLFs created
Between 64 MB and 1 GB         8
Greater than 1 GB              16
When a log file write reaches the end of the existing log file, SQL Server starts writing again at the
beginning, overwriting the log records currently stored. This mechanism works well, providing that the
previous log records in that section of the log file have already been written to the database and freed up,
or “truncated”. If they have not been truncated and the data is required, SQL Server tries to grow the size
of the log file. If this is not possible (for example, if it is not configured for automatic growth or the disk is
full) SQL Server fails the transaction and returns an error. If it is possible to grow the log file, SQL Server
allocates new virtual log files, using the auto growth size increment in the log file configuration.
Note: Instant File Initialization (IFI) cannot be used with transaction log files. This means
that transactions can be blocked while log file growth occurs.
Note: Replication is beyond the scope of this course but it is important to be aware that
the configuration and state of replicated data can affect transaction log truncation.
In earlier versions of SQL Server, simple recovery model was referred to as “truncate log on checkpoint”.
The name was changed to provide a focus on the recovery options, rather than on the process involved in
implementing the option. Each time a checkpoint process occurs, SQL Server will automatically truncate
the transaction log up to the end of the VLF, before the VLF that contains the MinLSN value. This means
that the only role that the transaction log ever plays is the provision of active transaction log data during
the recovery of a database.
Note: One potentially surprising outcome is that the log backups can often be larger than
the transaction logs. This is because SQL Server retrieves the modified extents from the data files
while performing a log backup for minimally-logged data.
Another common misconception is that the log file of a database in simple recovery model will not grow.
This is also not the case. In simple recovery model, the transaction log needs to be large enough to hold
all details from the oldest active transaction. Large or long-running transactions can cause the log file to
need additional space.
Note: Database mirroring, transactional replication, and change data capture are beyond
the scope of this course.
You can use the log_reuse_wait_desc column in the sys.databases catalog view to identify the reason why
you cannot truncate a log.
The values that can be returned for the log_reuse_wait_desc column include:
0 = Nothing
1 = Checkpoint
2 = Log backup
5 = Database mirroring
6 = Replication
8 = Log scan
9 = Other (transient)
After resolving the reason that is shown, perform a log backup (if you are using full recovery model) to
truncate the log file, and then you can use DBCC SHRINKFILE to reduce the file size of the log file.
Note: If the log file does not reduce in size when you use DBCC SHRINKFILE as part of the
above steps, the active portion of the log must have been located at the end of the file at that time.
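The steps above can be sketched as follows (the database and log file names are illustrative):

```sql
-- Identify why the log cannot be truncated.
SELECT name, log_reuse_wait_desc
FROM sys.databases
WHERE name = 'SalesDB';

-- In full recovery model, back up the log to allow truncation...
BACKUP LOG SalesDB TO DISK = 'D:\Backups\SalesDB.trn';

-- ...and then shrink the log file to a target size (in MB).
USE SalesDB;
DBCC SHRINKFILE (SalesDB_log, 512);
```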
Manual checkpoints are issued when you execute a Transact-SQL CHECKPOINT command. The
manual checkpoint occurs in the current database for your connection. By default, manual
checkpoints run to completion. The optional checkpoint duration parameter specifies a requested
amount of time, in seconds, for the checkpoint to complete.
Internal checkpoints are issued by various server operations, such as backup and database snapshot
creation, to guarantee that disk images match the current state of the log.
You can configure the target duration of a checkpoint operation by executing the CHECKPOINT
statement.
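For example, to request that a manual checkpoint in the current database complete within roughly 10 seconds (the duration is a request, not a guarantee; the database name is illustrative):

```sql
USE AdventureWorks;  -- the manual checkpoint applies to the current database
CHECKPOINT 10;       -- checkpoint_duration, in seconds
```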
Demonstration Steps
1. In SQL Server Management Studio, in Object Explorer, expand Databases, right-click LogTest, and
then click Properties.
2. In the Database Properties - LogTest dialog box, on the Options page, verify that the Recovery
model is set to Full, and then click Cancel.
4. In the Open File dialog box, go to D:\Demofiles\Mod05, click LogComparisonTest.sql, and then
click Open.
5. Select the code under the comment Perform a full database backup, and then click Execute.
6. Select the code under the comment View log file space, and then click Execute. Note the log size
and space used in the LogTest log.
7. Select the code under the comment Insert data, and then click Execute to insert 10000 rows.
8. Select the code under the comment View log file space, and then click Execute. Note that the log
size and space used in the LogTest log file has increased.
9. Select the code under the comment Issue checkpoint, and then click Execute to force SQL Server to
perform a checkpoint and flush the modified pages to disk.
10. Select the code under the comment View log file space, and then click Execute. Note the space
used in the LogTest log file has not decreased.
11. Select the code under the comment Check log status, and then click Execute. Note that SQL Server
is awaiting a log backup before the log file can be truncated.
12. Select the code under the comment Perform a log backup, and then click Execute.
13. Select the code under the comment Verify log file truncation, and then click Execute. Note the
space used in the LogTest log file has decreased because the log has been truncated.
Lesson 3
Planning Backup Strategies
To plan a backup strategy effectively, you should align your chosen combination of backup types to your
business recovery requirements. Most organizations will need to use a combination of backup types
rather than relying on just one.
Lesson Objectives
After completing this lesson, you will be able to:
Full Backups
A full backup of a database includes the data files and the active part of the transaction log. The first
step in the backup is that a CHECKPOINT operation is performed. The active part of the transaction log
includes all details from the oldest active transaction forward. A full backup represents the database at
the time that the data-reading phase of the backup was completed; this serves as your baseline in the
event of a system failure. Full backups do not truncate the transaction log.
Differential Backups
A differential backup is used to save the data that has been changed since the last full backup. Differential
backups are based on the data file contents, rather than on log file contents, and contain extents that
have been modified since the last full database backup. Differential backups are generally faster to restore
than transaction log backups, but they offer fewer options. For example, point-in-time recovery is not
available unless differential backups are combined with log file backups.
Partial Backups
A partial backup is similar to a full backup, but does not contain all of the filegroups. Partial backups
contain all the data in the primary filegroup, every read/write filegroup, and any specified read-only files.
A partial backup of a read-only database contains only the primary filegroup.
Note: Working with partial backups is an advanced topic that is beyond the scope of this
course.
Tail-log Backups
A transaction log backup that is taken just before a restore operation is called a tail-log backup. Typically,
tail-log backups are taken after a disk failure that affects data files only. From SQL Server 2005 onwards,
SQL Server has required that you take a tail-log backup before it will allow you to restore a database—to
protect against inadvertent data loss.
Also, tail-log backups are often possible even when the data files from the database are no longer
accessible.
Note: Working with file and filegroup backups is an advanced topic that is beyond the
scope of this course.
Copy-only Backups
SQL Server supports the creation of copy-only backups. Unlike other backups, a copy-only backup does
not impact the overall backup and restore procedures for the database. Copy-only backups can be used
to create a copy of the backup to take offsite to a safe location. Copy-only backups are also useful when
performing some online restore operations. All recovery models support copy-only data backups.
being used. At the end of the backup, SQL Server writes transaction log entries, which cover the period
when the backup was occurring, into the backup.
Scenarios that might be appropriate for using a full database backup strategy include:
Test systems.
Data warehouses where the data can be recovered from a source system and where the data in the
data warehouse does not change regularly.
Example
For example, you could perform a full backup of your database on Sunday, Monday, and Tuesday. This
means that during the day on Monday, up to a full day of data is exposed to risk until the backup is
performed. The same amount of exposure happens on Tuesday. After the Tuesday backup is carried out,
the risk increases every day until the next Sunday backup is performed.
When it is necessary to recover a database, the latest full database backup needs to be restored, along
with the most recent differential backup (if one has been performed). After the database has been
restored, transaction logs that have been backed up since that time are also restored, in order. Because
the restore works on a transactional basis, you can restore a database to a specific point in time, within
the transactions stored in the log backup.
In addition to providing capabilities that enable you to restore the transactions that have been backed up,
a transaction log backup truncates the transaction log. This enables VLFs in the transaction log to be
reused. If you do not back up the log frequently enough, the log files can fill up.
Example
For example, you could supplement your nightly full database backups with periodic transaction log
backups during the day. If your system fails, you can recover to the time of your last transaction log
backup. If, however, only the database data files failed, and a tail-log backup could be performed, no
committed data loss would occur.
As log backups typically take longer to restore than other types of backup, it is often advisable to
combine transaction log backups with periodic differential backups. During a recovery, only the
transaction log backups that were taken after the last differential backup need to be restored.
Differential Backups
From the time that a full backup occurs, SQL
Server maintains a map of extents that have been
modified. In a differential backup, SQL Server
backs up only those extents that have changed. However, it is important to realize that, after the
differential backup is performed, SQL Server does not clear that map of modified extents. The map is only
cleared when full backups occur. This means that a second differential backup performed on a database
will include all changes since the last full backup, not just those changes since the last differential backup.
For example, you might take a full database backup at midnight on Sunday (early Monday morning). You
could then take differential backups at midnight each other night of the week. The differential backup
taken on Monday night would include all data changed during Monday. The differential backup taken on
Tuesday night would include all data changed on Monday and Tuesday. The differential backup taken on
Friday night would include all data that changed on Monday, Tuesday, Wednesday, Thursday, and Friday.
This means that differential backups can grow substantially in size between each full backup interval.
Combinations of Backups
Differential backups must be combined with other forms of backup. Because a differential backup saves
all data changed since the last full backup was made, it cannot be taken unless a full backup has already
been performed.
Another important aspect to consider is that, when a recovery is required, multiple backups need to be
restored to bring the system back online—rather than a single backup. This increases the risk exposure for
an organization and must be considered when planning a backup strategy.
Differential backups can also be used in combination with both full and transaction log backups.
Managing file and filegroup backups can be complex, and the loss of a single data file backup can cause
serious problems, including making a database unrecoverable.
One way to simplify the process of backing up parts of a database is to use a partial backup, which backs
up only the primary filegroup and the read/write filegroups. However, this is only recommended when the
database contains enough data in read-only filegroups to make a substantial time and administrative
saving. It is also recommended that you use partial backups in conjunction with the simple recovery
model.
One of the key benefits of a partial backup strategy is that, in the event of a failure, you can perform a
piecemeal restore that makes data in the read/write filegroups available before the read-only filegroups
have been restored. This means you can reduce the recovery time for workloads that do not require the
data in the read-only filegroups.
For example: all read-only filegroups are backed up at midnight on Monday, along with a partial backup
that includes only the primary filegroup and all read/write filegroups. At the end of each subsequent day,
a partial differential backup is used to back up modified pages in read/write filegroups.
Question: What kind of database might be a good candidate for a full backup strategy?
If you have time, there is another issue that your manager would like you to work on. There is another
instance of SQL Server installed for supporting customer service operations. Your manager is concerned
that existing databases on the CustomerService server instance are configured inappropriately and have
invalid backup strategies, based on their RPO and RTO requirements. You need to review the database
recovery models and backup strategies for the databases on the CustomerService instance, and provide
recommended changes.
Supporting Documentation
Business Database Continuity Requirements for Databases on the Proseware Server Instance (for Exercises
1 and 2):
o The MarketDev database must never be unavailable for longer than eight hours.
o The Research database must never be unavailable for longer than two hours.
o When the Research database is recovered from a failure, all transactions that were completed up
to the end of the previous weekday must be recovered.
Projected Characteristics
Average rate of change to the Research database during office hours: 10 MB/hour
Office hours (no full database backups permitted during these hours): 08:00 to 18:00
Business Database Continuity Requirements for Databases on the CustomerService Server Instance (for
Exercise 3):
Recovery Time Objectives
o The CreditControl database must never be unavailable for longer than two hours.
o The PotentialIssue database must never be unavailable for longer than one hour.
o When the CreditControl database is recovered from a failure, no more than five minutes of
transactions may be lost.
o When the PotentialIssue database is recovered from a failure, no more than 30 minutes of
transactions may be lost.
Projected Characteristics
PotentialIssue database size (at the start of each week, after archiving activity is complete): 200 MB
Average rate of change to the CreditControl database during office hours: 500 MB/hour
Office hours (no full database backups permitted during these hours): 08:00 to 19:00
Objectives
After completing this lab, you will be able to:
Plan a backup strategy.
Username: AdventureWorks\Student
Password: Pa55w.rd
Note: To meet the business requirements, each database might need more than one
backup type and schedule.
Results: At the end of this exercise, you will have created a plan to back up two databases.
Results: At the end of this exercise, you will have modified the database recovery models where required.
Task 1: Review the RPO and RTO Requirements for the Databases
The supporting documentation includes details of the business continuity requirements for the
databases. Review this documentation.
Results: At the end of this exercise, you will have assessed the backup strategy.
Best Practice:
Plan your transaction log size, based on the transaction log backup frequency.
Review Question(s)
Question: When might a full database backup strategy be adequate?
Module 6
Backing Up SQL Server Databases
Contents:
Module Overview 6-1
Lesson 1: Backing Up Databases and Transaction Logs 6-2
Module Overview
In the previous module, you learned how to plan a backup strategy for a SQL Server® system. You can
now learn how to perform SQL Server backups, including full and differential database backups,
transaction log backups, and partial backups.
In this module, you will learn how to apply various backup strategies.
Objectives
After completing this module, you will be able to:
Lesson 1
Backing Up Databases and Transaction Logs
Now that you have seen how to plan a backup strategy for a SQL Server system, you can learn how to
perform SQL Server backups, including full and differential database backups, transaction log backups,
and partial backups.
Lesson Objectives
After completing this lesson, you will be able to:
The SSMS graphical user interface includes the following pages, on which you can configure backup
options:
General. Use this page to specify the database to be backed up, the backup type, the backup
destination, and other general settings.
Media Options. Use this page to control how the backup is written to the backup device(s); for
example, overwriting or appending to existing backups.
Backup Options. Use this page to configure backup expiration, compression, and encryption.
In SQL Server, you can perform backups while other users continue working with the database, although
those users might experience a performance impact due to the I/O load that the backup operation places
on the system. SQL Server does place some limitations on the types of commands you can execute
while a backup is being performed. For example, you cannot use the ALTER DATABASE command with
the ADD FILE or REMOVE FILE options, or shrink a database, during a backup. Additionally, you cannot
include the BACKUP command in an explicit or implicit transaction, nor can you roll back a backup
statement.
You can only back up databases when they are online, but it is possible to perform a backup of the
transaction log when a database is damaged, if the log file itself is still intact. This is why it is so important
to split data and log files onto separate physical media.
Backup Timing
An important consideration when making a backup is to understand the timing associated with its
contents—the database may be in use while the backup is occurring. For example, if a backup starts at
22:00 and finishes at 01:00, does it contain a copy of the database as it was at 22:00, a copy as it was at
01:00, or a copy from a time between the start and finish?
SQL Server writes all data pages to the backup device in sequence, but uses the transaction log to track
any pages that are modified while the backup is occurring. SQL Server then writes the relevant portion of
the transaction log to the end of the backup. This process makes the backups slightly larger than in earlier
versions, particularly if heavy update activities are happening at the same time. This altered process also
means that the backup contains a copy of the database as it was at a time just before the completion of
the backup—not as it was at the time the backup started.
In very large systems, it is common to have to perform disk-to-disk imaging while the system is in
operation, because standard SQL Server backups might take too long to be effective. With the VDI
programming interface, an application can freeze SQL Server operations momentarily while it takes a
consistent snapshot of the database files. This form of snapshot is commonly used in geographically-
distributed storage area network (SAN) replication systems.
Note: Direct backup to tape is not supported. If you want to store backups on tape, you
should first write the backup to disk, and then copy the disk backup to tape.
If a media set spans several backup devices, the backups will be striped across the devices.
Note: No parity device is used while striping. If two backup devices are used together, each
receives half the backup. Both must also be present when attempting to restore the backup.
Every backup operation to a media set must write to the same number and type of backup devices. Media
sets and the backup devices are created the first time a backup is attempted on them. Media sets and
backup sets can also be named at the time of creation and given a description.
The backups on an individual device within a media set are referred to as a media family. The number of
backup devices used for the media set determines the number of media families in a media set. For
example, if a media set uses two backup devices, it contains two media families.
INIT/NOINIT. The INIT option retains the existing media header, but overwrites all existing backup
sets in the media set. By default, SQL Server uses the NOINIT option to avoid accidental backup
deletion. In SQL Server Management Studio, you can select Backup to the existing media set, and
then select Append to the existing backup set to use the NOINIT option, or Overwrite all existing
backups to use the INIT option.
As an example, consider the following code, which backs up a database to a media set that consists of two
files. Assuming the files do not already exist, SQL Server creates them and uses them to define a new
media set. The data from the backup is striped across the two files:
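A sketch of such a command, striping one backup across two files (the file paths are illustrative):

```sql
-- The two files do not yet exist; SQL Server creates them and
-- defines a new media set containing two media families.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Stripe1.bak',
   DISK = 'D:\Backups\AW_Stripe2.bak';
```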
Another backup could be made later, to the same media set. The data from the second backup is again
striped across the two files, and the header of the media set is updated to indicate that it now contains
the two backups:
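That second backup might be sketched like this (same illustrative file paths as assumed above; NOINIT appends a new backup set to the existing media set):

```sql
-- Append a second backup set to the same two-file media set.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Stripe1.bak',
   DISK = 'D:\Backups\AW_Stripe2.bak'
WITH NOINIT;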
If a user then tries to create another backup to only one of the backup files in the media set using the
following code, SQL Server will return an error, because all backup sets to a media set must use the same
backup devices:
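As a sketch (file path illustrative), the failing command would look like this:

```sql
-- Fails: this media set was created across two devices, so a backup
-- written to only one of them returns an error.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Stripe1.bak'
WITH NOINIT;
```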
Before the member of the media set can be overwritten, the FORMAT option has to be added to the
WITH clause in the backup command. This creates a new media set that contains a single file. The original
media set, together with all of the backup sets it contains, is no longer valid:
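A sketch of that command (file path illustrative):

```sql
-- FORMAT overwrites the file and creates a new media set containing a
-- single file; the original striped media set becomes invalid.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Stripe1.bak'
WITH FORMAT;
```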
Use the FORMAT option to overwrite the contents of a backup file and split up the media set, but use it
very carefully. Formatting one backup file of a media set renders the entire media set unusable.
The following code makes a differential database backup of the AdventureWorks database and stores it in
a file named 'D:\Backups\AW.bak'. The NOINIT option appends the backup to any existing backups in the
media set:
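A sketch of that command, using the database name and file path stated above:

```sql
-- Differential backup; NOINIT appends to existing backups in the media set.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW.bak'
WITH DIFFERENTIAL, NOINIT;
```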
Note: You cannot create a differential database backup unless a full database backup has
been taken first.
A transaction log backup finds the MaxLSN of the last successful transaction log backup, and saves all log
entries beyond that point to the current MaxLSN. The process then truncates the transaction log as far as
is possible (unless the COPY_ONLY or NO_TRUNCATE option is specified). The longest-running active
transaction must be retained, in case the database has to be recovered after a failure.
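A basic transaction log backup can be sketched as follows (database and file path are illustrative):

```sql
-- Backs up the log and truncates its inactive portion
-- (unless COPY_ONLY or NO_TRUNCATE is specified).
BACKUP LOG AdventureWorks
TO DISK = 'D:\Backups\AW_Log.trn';
```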
For example, imagine a scenario where you create a database, and later take a full backup. At this point,
the database can be recovered. If the recovery model of the database is then changed to simple and
subsequently switched back to full, a break in the log file chain has occurred. Even though a previous full
database backup exists, the database can only be recovered up to the point of the last transaction log
backup, taken before the change to simple recovery model.
After switching from simple to full recovery model, you must perform a full database backup to create a
starting point for transaction log backups.
Note: Not all restore scenarios require a tail-log backup. You do not need to have a tail-log
backup if the recovery point is contained in an earlier log backup or if you are moving or
replacing (overwriting) the database and do not have to restore it to a point of time after the
most recent backup.
When performing a tail-log backup of a database that is currently online, you can use the NORECOVERY
option to immediately place the database into a restoring state, preventing any more transactions from
occurring until the database is restored.
If the database is damaged, you can use the NO_TRUNCATE option, which causes the database engine to
attempt the backup, regardless of the state of the database. This means that a backup taken while using
the NO_TRUNCATE option might have incomplete metadata.
If you are unable to back up the tail of the log using the NO_TRUNCATE option when the database is
damaged, you can attempt a tail-log backup by specifying the CONTINUE_AFTER_ERROR option.
There are two techniques you can use to back up parts of a database, rather than the whole database:
Partial backup. A partial backup backs up only the primary filegroup and filegroups that are set to
read-write. You can also include specific read-only filegroups if required. The purpose of a partial
backup is to make it easy for you to back up the parts of a database that change, without having to
plan the backup of specific files or filegroups. You can perform a full or differential partial backup.
File and filegroup backups. With a file or filegroup backup, you can back up only selected files or
filegroups in a database. This can be useful with very large databases that would take too long to
back up in full, because you can back them up in phases. It is also useful for databases that contain
some read-only data, or data that changes at different rates, because you can back up only the read-
write data, or back up frequently updated data more often.
The following code example performs a partial backup that includes the primary filegroup and all
read/write filegroups:
A Partial Backup
BACKUP DATABASE LargeDB
READ_WRITE_FILEGROUPS
TO DISK = 'D:\Backups\LrgRW.bak'
WITH INIT;
The following code example backs up specific filegroups. You can also use the FILE parameter to back up
individual files:
A Filegroup Backup
BACKUP DATABASE LargeDB
FILEGROUP = 'LrgFG2'
TO DISK = 'D:\Backups\LrgFG2.bak'
WITH INIT;
Demonstration Steps
Perform a Full Database Backup
1. Ensure that you have started the 20764C-MIA-DC and 20764C-MIA-SQL virtual machines, log on to
20764C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.
2. In the D:\Demofiles\Mod06 folder, run Setup.cmd as Administrator. Wait for the script to finish,
and then press Enter.
3. Start SQL Server Management Studio, and connect to the MIA-SQL database engine using Windows
authentication.
4. In Object Explorer, expand Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
5. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
6. In the Destination section, select each existing file path and click Remove, and then click Add.
7. In the Select Backup Destination dialog box, in the File name box, type
D:\Demofiles\Mod06\Demo\AW.bak, and then click OK.
8. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, note that the
default option is to append to an existing media set. In this case, there is no existing media set so a
new one will be created, and there are no existing backup sets to overwrite.
9. In the Back Up Database - AdventureWorks dialog box, on the Backup Options page, note the
default backup name and expiration settings.
10. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, click Script
Action to New Query Window, and then click OK.
12. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database.
13. View the D:\Demofiles\Mod06\Demo folder and note the size of the AW.bak file.
Perform a Differential Backup
1. In SQL Server Management Studio, open the UpdatePrices.sql script file from the
D:\Demofiles\Mod06\Demo folder, and click Execute. This script updates the Production.Product
table in the AdventureWorks database.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
3. In the Back Up Database - AdventureWorks dialog box, in the Backup type list, click Differential.
4. In the Destination section, ensure that D:\Demofiles\Mod06\Demo\AW.bak is the only backup
device listed.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, verify that
the option to append to the existing media set is selected.
6. In the Back Up Database - AdventureWorks dialog box, on the Backup Options page, change the
Name to AdventureWorks-Diff Database Backup.
7. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, click Script
Action to New Query Window, and then click OK.
10. View the D:\Demofiles\Mod06\Demo folder, and note that the size of the AW.bak file has
increased, but not much—the second backup only includes the extents containing pages that were
modified since the full backup.
Perform a Transaction Log Backup
1. In SQL Server Management Studio, switch to the UpdatePrices.sql script you opened previously, and
click Execute to update the Production.Product table in the AdventureWorks database again.
2. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and click Back
Up.
3. In the Back Up Database - AdventureWorks dialog box, in the Backup type list, click Transaction
Log.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, verify that
the option to append to the existing media set is selected. Also verify that the option to truncate the
transaction log is selected.
6. In the Back Up Database - AdventureWorks dialog box, on the Backup Options page, change the
Name to AdventureWorks-Transaction Log Backup.
7. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, click Script
Action to New Query Window, and then click OK.
9. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database.
Note that this time the statement is BACKUP LOG.
10. View the D:\Demofiles\Mod06\Demo folder and note that the size of the AW.bak file has increased, but
not much—the third backup only includes the transaction log entries for data modifications since the
full backup.
11. Keep SQL Server Management Studio open for the next demonstration.
Lesson 2
Managing Database Backups
No matter how many backups you perform, it is essential to make sure that they are readable and
restorable; otherwise, the entire backup strategy is flawed. It is also important to be able to query
information about your backups so you can access the correct data when required.
In this lesson, you will learn how to verify a backup and ensure its integrity, and how to retrieve backup
history and header information.
Lesson Objectives
After completing this lesson, you will be able to:
Overwriting your most recent backup is generally regarded as the worst option: if the system fails
during the backup, you lose both your data and your backup.
Insufficient Data on the Backups. Company A performed regular backups, but recovery was never
tested. The first time a real recovery was attempted, it was discovered that not all the files that
needed to be backed up were in fact being backed up.
Unreadable Backups. Company B performed regular backups but did not test them. When recovery
was attempted, none of the backups were readable. This is often caused by hardware failures, but
can also result from inappropriate storage of the media.
Avoidance strategy: regular backup recovery testing and use of redundant backup hardware.
Unavailable Hardware. Company C purchased a special tape drive to perform backups. When the
time came to restore the backups, that special device no longer worked, and the organization had no
other way to read the backups, even if they were valid.
Avoidance strategy: regular backup recovery testing, combined with recovery and backup onto
current devices.
Misaligned Hardware. Company E performed regular backups and even tested that they could
undertake restore operations from the backups. However, because they tested the restores on the
same device that performed the backups, they did not realize that the device was misaligned and it
was the only one that could read those backups. When a restore was needed, the device that the
backups were performed on had failed.
Avoidance strategy: regular backup recovery testing on a separate system and physical device.
General Considerations
There are several general points to consider regarding the retention period of backups.
When a backup strategy calls for you to perform multiple types of backups, it is important to work
out the combination of backups you will require.
Organizations might have to fulfill legal or compliance requirements regarding the retention of
backups. In most cases, full database backups are kept for a longer time than other types.
Checking the consistency of databases by using the DBCC CHECKDB statement is a crucial part of
database maintenance, and is discussed later in the course.
In addition to deciding how long backups should be retained, you must determine where they are
kept. An important part of meeting the RTO is to consider how long it takes to obtain the physical
backup media if it has to be restored.
You should also make sure that backups are complete. Are all files that are needed to recover the
system (including external operating system files) being backed up?
Because it is better to have multiple copies of a backup than a single copy, mirroring a media set can
increase the availability of your backups. However, it is important to realize that mirroring a media set
also exposes your system to a higher level of hardware failure risk, because a malfunction of any of the
backup devices causes the entire backup operation to fail.
You can create a mirrored backup set by using the MIRROR TO option of the BACKUP statement:
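The following sketch shows the shape of such a statement, assuming two local backup paths; creating a new mirrored media set requires the FORMAT option:

```sql
-- Write the backup to one device and a mirror copy to a second device.
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW.bak'
MIRROR TO DISK = 'E:\Backups\AW_Mirror.bak'
WITH FORMAT;
```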
Note: The mirrored media set functionality is only available in SQL Server Enterprise
Edition.
You can configure SQL Server to assess the checksum value, either during restore operations or during
backup verification operations made with the RESTORE VERIFYONLY command.
You switch on the checksum option by using the WITH CHECKSUM clause of the BACKUP statement:
Using a Checksum
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW.bak'
WITH CHECKSUM;
Backup Verification
To verify a backup, you can use the RESTORE VERIFYONLY statement that checks the backup for validity
but does not restore it. The statement performs the following checks:
Page identifiers are correct (to the same level as if it were about to write the data).
The checksum value can only be validated if the backup was performed with the WITH CHECKSUM
option. Without the CHECKSUM option during backup, the verification options only check the metadata
and not the actual backup data.
The RESTORE VERIFYONLY statement is similar to the RESTORE statement and supports a subset of its
arguments:
Verifying a Backup
RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\AW.bak'
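If the backup was taken with the CHECKSUM option, you can also ask the verification step to revalidate the checksums; a minimal sketch:

```sql
-- Revalidates page checksums recorded at backup time.
RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\AW.bak'
WITH CHECKSUM;
```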
You can also perform verification steps by using the Back Up Database task in SSMS.
Note: Consider verifying backups on a different system to the one where the backup was
performed. This will eliminate the situation where a backup is only readable on the source
hardware.
SQL Server stores the backup history for a server instance in the following tables in the msdb database:
backupfile
backupfilegroup
backupmediafamily
backupmediaset
backupset
You can query these tables to retrieve information about backups that have been performed:
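For example, the following query (a sketch; the column choices are illustrative) lists recent backups with their type, timing, and physical location:

```sql
SELECT bs.database_name,
       bs.type,                 -- D = full, I = differential, L = log
       bs.backup_start_date,
       bs.backup_finish_date,
       bmf.physical_device_name
FROM msdb.dbo.backupset AS bs
JOIN msdb.dbo.backupmediafamily AS bmf
    ON bs.media_set_id = bmf.media_set_id
ORDER BY bs.backup_start_date DESC;
```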
Note: If a database is restored onto another server, the backup information is not restored
with the database, because it is held in the msdb database of the original system.
Demonstration Steps
View the Backup and Restore Events Report
1. In Object Explorer, right-click AdventureWorks, point to Reports, point to Standard Reports, and
then click Backup and Restore Events.
2. In the Backup and Restore Events [AdventureWorks] report, expand Successful Backup
Operations and view the backup operations that have been performed for this database.
3. In the Device Type column, expand each of the Disk (temporary) entries to view details of the
backup media set files.
2. Highlight the code under the comment View backup history, and click Execute.
3. View the query results, which show the backups that have been performed for the AdventureWorks
database.
4. Note that line 21 of the code restricts the retrieval period, which is currently set to 30 days. You can
modify this by changing the number 30, or by removing the line.
Verify Backup Media
1. Highlight the code under the comment Use RESTORE HEADERONLY, and click Execute.
2. View the query results, which show the backups in the AW.bak backup device.
3. Highlight the code under the comment Use RESTORE FILELISTONLY, and click Execute.
4. View the query results, which show the database files contained in the backups.
5. Highlight the code under the comment Use RESTORE VERIFYONLY, and click Execute.
6. View the message that is returned, which should indicate that the backup is valid.
Lesson 3
Advanced Database Options
SQL Server Backup provides a range of options that can help optimize your backup strategy, including the
ability to perform a copy-only backup, compress backups, and encrypt backups.
Lesson Objectives
After completing this lesson, you will be able to:
Copy-Only Backups
A copy-only SQL Server backup is independent of the sequence of conventional SQL Server backups.
Usually, taking a backup changes the database and affects how later backups are restored. However, you
may need to take a backup for a special purpose without affecting the overall backup and restore
procedures for the database.
Copy-only Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Copy.bak'
WITH COPY_ONLY, INIT;
You can make copy-only backups of either the database or the transaction logs. Restoring a copy-only full
backup is the same as restoring any full backup.
Compressing Backups
Backup files can quickly become very large, so SQL Server gives you the ability to compress them. You
can set the default backup compression behavior and also override this setting for individual backups.
The following restrictions apply to compressed backups:
Backup compression is only available in the Enterprise and Standard Editions of SQL Server.
You can use the property pages for the server to view and configure the default backup compression
setting.
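Alternatively, you can change the server-wide default in Transact-SQL with sp_configure. A minimal sketch:

```sql
-- 'backup compression default' = 1 compresses all backups on the
-- instance unless NO_COMPRESSION is specified for an individual backup.
EXEC sp_configure 'backup compression default', 1;
RECONFIGURE;
```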
To compress a backup, you can use the WITH COMPRESSION option of the BACKUP statement:
Compressing a Backup
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Comp.bak'
WITH COMPRESSION;
If your default setting is to compress backups and you want to override this, use the NO_COMPRESSION
option.
The compression level that can be achieved depends upon how compressible the data is in the database.
Some data compresses well, other data does not. A reduction in I/O and backup size of 30 to 50 percent is
not uncommon in typical business systems.
However, any form of compression tends to increase CPU usage. The additional CPU resources that are
consumed by the compression process may adversely impact concurrent operations on systems that are
CPU bound. Most current SQL Server systems are I/O bound, rather than CPU bound, so the benefit of
reducing I/O usually outweighs the increase in CPU requirements by a significant factor.
Demonstration Steps
Use Backup Compression
2. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
3. In the Destination section, select the existing file path, click Remove, and then click Add.
4. In the Select Backup Destination dialog box, in the File name box, type
D:\Demofiles\Mod06\Demo\AW_Comp.bak, and then click OK.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, note that the
default option is to append to an existing media set. In this case, there is no existing media set, so a
new one will be created—there are no existing backup sets to overwrite.
6. In the Back Up Database - AdventureWorks dialog box, on the Backup Options page, change the
Name to AdventureWorks-Compressed Backup.
7. In the Set backup compression list, click Compress backup.
8. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, click Script
Action to New Query Window, and then click OK.
10. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database,
noting that the COMPRESSION option was specified.
11. View the D:\Demofiles\Mod06\Demo folder and note the size of the AW_Comp.bak file. This
should be significantly smaller than the AW.bak file was after the full database backup in the
previous demonstration.
12. Keep SQL Server Management Studio open for the next demonstration.
Encrypting Backups
Backups are a fundamental requirement for protecting an organization’s data against hardware failure or
natural disaster. However, the data in the backup may be sensitive, and you must ensure that the backup
media is secured against unauthorized access to the data it contains. In most organizations, you can
accomplish this goal by storing backup media in secured file system locations. However, it is common for
organizations to use an offsite storage solution for backups to protect against loss of data in the event of
a disaster that affects the entire site (for example, a flood or fire). In this kind of scenario, or when the
data in the backup requires additional security for compliance reasons, you can encrypt backups so that
they can only be restored on a SQL Server instance that contains the correct encryption key. Backup
encryption in SQL Server is based on standard encryption algorithms, including AES 128, AES 192,
AES 256, and Triple DES. To encrypt a backup, you must specify the algorithm you want to use and a
certificate or asymmetric key that can be used to encrypt the data.
1. Create a database master key in the master database. This is a symmetric key that is used to protect
all other encryption keys and certificates in the database.
2. Create a certificate or asymmetric key with which to encrypt the backup. You can create a certificate
or asymmetric key in a SQL Server database engine instance by using the CREATE CERTIFICATE or
CREATE ASYMMETRIC KEY statement. Note that asymmetric keys must reside in an extended key
management (EKM) provider.
3. Perform the backup using the ENCRYPTION option (or select Encryption in the Backup Database
dialog box), and specifying the algorithm and certificate or asymmetric key to be used. When using
the Backup Database dialog box, you must select the option to back up to a new media set.
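Steps 1 and 2 above can be sketched in Transact-SQL as follows; the password is a placeholder, and the certificate name matches the BackupCert certificate used later in this topic:

```sql
USE master;

-- Step 1: create a database master key in master (placeholder password).
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<StrongPassword123!>';

-- Step 2: create a certificate with which to encrypt backups.
CREATE CERTIFICATE BackupCert
    WITH SUBJECT = 'Certificate for backup encryption';
```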
You should back up the database master key and encryption keys to a secure location (separate from the
backup media location) so that you can restore the database to a different SQL Server instance in the
event of a total server failure.
The following example code backs up the AdventureWorks database using the AES 128 encryption
algorithm and a certificate named BackupCert:
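A sketch of the statement described above; the backup path is illustrative, and FORMAT is included because an encrypted backup must be written to a new media set:

```sql
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Backups\AW_Encrypt.bak'
WITH ENCRYPTION (
        ALGORITHM = AES_128,
        SERVER CERTIFICATE = BackupCert
     ),
     FORMAT;
```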
Demonstration Steps
Create a Database Master Key
3. Select the code under the comment Create a database master key and click Execute.
4. Select the code under the comment Back up the database master key and click Execute.
Create a Certificate
1. Select the code under the comment Create a certificate and click Execute.
2. Select the code under the comment Back up the certificate and its private key and click Execute.
Encrypt a Database Backup
1. In Object Explorer, under Databases, right-click AdventureWorks, point to Tasks, and then click
Back Up.
2. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
3. In the Destination section, select the existing file path, click Remove, and then click Add.
4. In the Select Backup Destination dialog box, in the File name box, type
D:\Backups\AW_Encrypt.bak, and then click OK.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, click Back up
to a new media set, and erase all existing backup sets.
7. In the Back Up Database - AdventureWorks dialog box, on the Backup Options page, change the
Name to AdventureWorks-Encrypted Backup.
9. In the Encryption section, select the Encrypt backup check box, ensure that the AES 256 algorithm
is selected, and select the AdventureWorks certificate you created previously.
10. In the Back Up Database - AdventureWorks dialog box, in the Script drop-down list, click Script
Action to New Query Window, and then click OK.
12. In the query pane, view the Transact-SQL BACKUP statement that was used to back up the database,
noting that the ENCRYPTION option was specified.
13. Close SSMS without saving any changes.
Objectives
After completing this lab, you will be able to:
Implement a backup strategy based on full, differential, and transaction log backups.
Password: Pa55w.rd
2. Verify that the backup file has been created, and note its size.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
2. Verify that the report shows the two backups you have created, AdventureWorks-Full Database
Backup and AdventureWorks-Full Database Backup 2.
Results: At the end of this exercise, you will have backed up the AdventureWorks database to
D:\Backups\AdventureWorks.bak using the simple recovery model.
2. Use SQL Server Management Studio to check the current recovery model of the database, and
change it if necessary.
2. Verify that the backup file has been created, and note its size.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
o Back up the log to the existing media set, and append the backup to the existing backup sets.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
o Back up the log to the existing media set, and append the backup to the existing backup sets.
RESTORE HEADERONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
2. Use the following query to identify the database files that are included in the backups:
RESTORE FILELISTONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
3. Use the following query to verify that the backups are valid:
RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
Results: At the end of this exercise, you will have backed up the national database to
D:\Backups\AWNational.bak.
2. This script creates the read-only components that you need for this lab. When the script completes,
close the query pane without saving the file.
2. Verify that the backup file AWReadOnly.bak has been created in the D:\Labfiles\Lab06\Starter
folder.
2. Verify that the backup file AWPartial.bak has been created in the D:\Labfiles\Lab06\Starter folder.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
2. Verify that the backup file AWPartialDifferential.bak has been created in the
D:\Labfiles\Lab06\Starter folder.
2. Use the following query to view the backups on AWPartial.bak, and scroll to the right to view the
BackupTypeDescription column:
Results: At the end of this exercise, you will have backed up the read-only filegroup in the
AdventureWorks database to D:\Backups\AWReadOnly.bak; and you will have backed up the writable
filegroups in the AdventureWorks database to D:\Backups\AWReadWrite.bak.
Best Practice:
Plan your transaction log size based on the transaction log backup frequency.
Review Question(s)
Question: What are the unique features of transaction log restores?
Module 7
Restoring SQL Server Databases
Contents:
Module Overview 7-1
Lesson 1: Understanding the Restore Process 7-2
Module Overview
In the previous module, you learned how to create backups of Microsoft® SQL Server® databases. A
backup strategy might involve many different types of backup, so it is essential that you can effectively
restore them.
You will often be restoring a database in an urgent situation. You must, however, ensure that you have a
clear plan of how to proceed and successfully recover the database to the required state. A good plan and
understanding of the restore process can help avoid making the situation worse.
Some database restores are related to system failure. In these cases, you will want to return the system as
close as possible to the state it was in before the failure. Some failures, though, are related to human error
and you might wish to recover the system to a point before that error. The point-in-time recovery
features of SQL Server can help you to achieve this.
Because they are typically much larger, user databases are more likely to be affected by system failures
than system databases. However, system databases can be affected by failures, and special care should be
taken when recovering them. In particular, you need to understand how to recover each system database
because you cannot use the same process for all system databases.
In this module, you will see how to restore user and system databases and how to implement point-in-
time recovery.
Objectives
After completing this module, you will be able to:
Restore databases.
Lesson 1
Understanding the Restore Process
When you need to recover a database, you must have a good plan to avoid causing further damage. After
you have completed the preliminary step of attempting to create a tail-log backup, it is most important to
determine which database backups to restore—and their order.
Lesson Objectives
After completing this lesson, you will be able to:
Data Copy
The data copy phase is typically the longest in a database restore. First, the data files from the database
need to be retrieved from the backups. Before any data pages are restored, the restore process reads the
header of the backup, and SQL Server recreates the required data and log files. If instant file initialization
(IFI) has not been enabled by granting rights to the SQL Server service account, the rewriting of the data
files can take a substantial amount of time.
After the data and log files are recreated, the data files are restored from the full database backup. Data
pages are retrieved from the backup in order and written to the data files. The log files need to be zeroed
out before they can be used—if the log files are large, this process can also take a substantial time.
If a differential backup is also being restored, SQL Server overwrites the extents in the data files with those
contained in the differential backup.
Redo Phase
At the start of the redo phase, SQL Server retrieves details from the transaction log. In the simple recovery
model, these details are retrieved from either the full database backup or the differential backup. In the
full or bulk-logged recovery model, these log file details are supplemented by the contents of any
transaction log backups that were taken after the full and differential database backups.
In the redo phase, SQL Server rolls forward all changes that are contained within the transaction log
details into the database pages, up to the recovery point. Typically, the recovery point is the latest time
for which transactions exist in the log.
Undo Phase
The transaction log will likely include details of transactions that were not committed at the recovery
point, which, typically, is the time of the failure. In the undo phase, SQL Server rolls back any of these
uncommitted transactions.
Because the action of the undo phase involves rolling back uncommitted transactions and placing the
database online, no more backups can be restored.
During the undo phase, the Enterprise edition of SQL Server will allow the database to come online so
that users can begin to access it. This capability is referred to as the fast recovery feature. Queries that
attempt to access data that is still being undone are blocked until the undo phase is complete. Potentially,
this can cause transactions to time out, but does mean that users can access the database sooner.
In general, you cannot bring a database online until it has been recovered. The one exception to this is
the fast recovery option, which allows users to access the database while the undo phase is continuing.
Recovery does not only occur during the execution of RESTORE commands. If a database is taken offline
and then placed back into an ONLINE state, recovery of the database will also occur. The same recovery
process takes place when SQL Server restarts.
Note: Other events that lead to database recovery include clustering or database mirroring
failovers. Failover clustering and database mirroring are advanced topics that are beyond the
scope of this course.
Types of Restores
The restore scenarios available for a database depend on its recovery model and the edition of SQL
Server you are using.
In most scenarios involving the simple recovery model, no differential backups are performed. In these
cases, you only restore the last full database backup, after which the recovery phase returns the database
to the state it was in at the time just before the full database backup was completed.
Piecemeal Restore
A piecemeal restore is used to restore and recover the database in stages, based on filegroups, rather than
restoring the entire database at a single time. The primary filegroup is the first that must be restored,
usually along with the read/write filegroups for which you want to prioritize recovery. You can then
restore read-only filegroups.
Page Restore
Another advanced option provides the ability to restore an individual data page. If an individual data
page is corrupt, users will usually see either an 823 error or an 824 error when they execute a query that
tries to access the page. You can try to recover the page using a page restore. If a user query tries to
access the page after the restore starts, they will see error 829, which indicates the page is restoring. If the
page restore is successful, user queries that access the page will again return results as expected. Page
restores are supported under full and bulk-logged recovery models, but not under simple recovery model.
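As a sketch, a page restore names the damaged pages as file:page pairs (the page IDs and file names below are illustrative), restores them from a full backup, and then applies subsequent log backups before recovering:

```sql
-- Restore two damaged pages from the most recent full backup.
RESTORE DATABASE AdventureWorks
PAGE = '1:57, 1:202'
FROM DISK = 'D:\Backups\AW.bak'
WITH NORECOVERY;

-- Apply later log backups (ending with a fresh tail-log backup),
-- then recover the database.
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW_Log.bak'
WITH RECOVERY;
```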
Online Restore
Online restore involves restoring data while the database is online. This is the default option for File, Page,
and Piecemeal restores.
1. Restore the latest full database backup as a base to work from. If only individual files are damaged or
missing, you may be able to restore just those files.
2. If there are differential backups, you only need to restore the latest differential backup.
3. If transaction log backups exist, you need to restore all transaction log backups since the last
differential backup. You also need to include the tail-log backup created at the start of the restore
process—if the tail-log backup was successful. (This step does not apply to databases using the simple
recovery model.)
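The sequence above can be sketched as follows for a database in the full recovery model; the backup file names are illustrative:

```sql
-- 1. Restore the latest full backup, leaving the database restoring.
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW_Full.bak' WITH NORECOVERY;

-- 2. Restore the latest differential backup.
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW_Diff.bak' WITH NORECOVERY;

-- 3. Restore each log backup taken since the differential, in order,
--    finishing with the tail-log backup and recovering the database.
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW_Log1.bak' WITH NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW_Tail.bak' WITH RECOVERY;
```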
Lesson 2
Restoring Databases
Most restore operations involve restoring a full database backup, often followed by a differential backup,
and a sequence of transaction log backups. In this lesson, you will learn how to restore these types of
backup and recover a database.
Lesson Objectives
After completing this lesson, you will be able to:
Restoring a Database
The simplest recovery scenario is to restore a database from a single full database backup. If no
subsequent differential or transaction log backups need to be applied, you can use the RECOVERY option
to specify that SQL Server should complete the recovery process for the database and bring it online. If
additional backups must be restored, you can prevent recovery from occurring by specifying the
NORECOVERY option. If you do not specify either of these options, SQL Server uses RECOVERY as the
default behavior.
In the following example, the AdventureWorks database is restored from the AW.bak backup media:
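A minimal form of this restore might look like the following (a sketch; the backup path is assumed):

```sql
-- Restore the full database backup and bring the database online
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH RECOVERY;
```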
In the following code example, the existing AdventureWorks database is replaced with the database in
the AW.bak backup media:
Replace Database
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH REPLACE';
Note: The WITH REPLACE option must be used with caution, because it overwrites the existing database and can lead to data loss.
In this example, the AdventureWorks database is being restored from another server. In addition to
specifying the source location for the media set, new locations for each database file are also specified in
the RESTORE statement. Note that the MOVE option requires the specification of the logical file name,
rather than the original physical file path.
WITH MOVE
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH MOVE 'AdventureWorks_Data' TO 'Q:\Data\AdventureWorks.mdf',
MOVE 'AdventureWorks_Log' TO 'U:\Logs\AdventureWorks.ldf';
The RESTORE command includes an option to specify WITH RECOVERY or WITH NORECOVERY. The WITH
RECOVERY option is the default action and does not need to be specified. This ensures that a database is
brought online immediately after being restored from a full database backup. However, when your
backup strategy requires you to restore additional backups subsequent to the full database backup, it is
important to choose the correct option for each RESTORE command. In most cases, this process is
straightforward. All restores must be performed WITH NORECOVERY except the last restore, which must
be WITH RECOVERY. Until the final backup is restored with the RECOVERY option, the database name will
display as <Database name> (Restoring…) in SSMS.
There is no way to restore additional backups after a WITH RECOVERY restore has been processed. If you
accidentally perform a restore using the WITH RECOVERY option, you must restart the entire restore
sequence.
In this example, the AdventureWorks database is restored from the first file in the media set containing a
full database backup. This media set is stored in the operating system file D:\Backups\AW.bak. The second
file in the media set is the first differential backup, but the changes in this are also contained in the
second differential backup, in the third file. Therefore, the second RESTORE statement only needs to
restore the contents of the third file.
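The sequence just described might be sketched as follows (file positions as described; the path is assumed):

```sql
-- Restore the full backup (first file in the media set), without recovery
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 1, NORECOVERY;

-- Restore the latest differential backup (third file) and recover
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 3, RECOVERY;
```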
If both the full and differential backup sets are on the same backup media, SSMS automatically selects the
required backups in the Restore Database dialog box, and ensures that the appropriate recovery settings
are applied.
A missing backup in the chain of transaction logs will cause the restore process to fail, and require you to
restart the recovery from the beginning.
In the following example, the AdventureWorks database has failed but the log file was accessible, so a
tail-log backup has been stored in AW-TailLog.bak. To restore the database, the latest full backup
(backup set 1 in AW.bak) is restored using the NORECOVERY option, followed by the latest differential
backup (backup set 3 in AW.bak), again with the NORECOVERY option. All subsequent planned
transaction log backups (backup sets 4 and 5 in AW.bak) are then restored in chronological order with
the NORECOVERY option. Finally, the tail-log backup (the only backup set in AW-TailLog.bak) is restored
with the RECOVERY option.
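The restore sequence described above might be sketched in Transact-SQL as follows (backup set positions as described; paths are assumed):

```sql
-- 1. Restore the latest full backup (backup set 1)
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 1, NORECOVERY;

-- 2. Restore the latest differential backup (backup set 3)
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 3, NORECOVERY;

-- 3. Restore the planned log backups in order (backup sets 4 and 5)
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 4, NORECOVERY;

RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW.bak'
WITH FILE = 5, NORECOVERY;

-- 4. Restore the tail-log backup and recover the database
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-TailLog.bak'
WITH RECOVERY;
```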
Note: In the previous example, the log file was available after the database failed, so a tail-
log backup could be taken. This means that the database can be recovered to the point of failure.
Had the log file not been available, the last planned transaction log backup (backup set 5 in
AW.bak) would have been restored using the RECOVERY option; all transactions since that
backup would have been lost.
Restore a database.
Demonstration Steps
Create a Tail-log Backup
1. Ensure that the 20764C-MIA-DC and 20764C-MIA-SQL virtual machines are running, and log on to
20764C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.
2. In File Explorer, navigate to D:\Demofiles\Mod07, right-click Setup.cmd, and then click Run as
administrator.
3. In the User Account Control dialog box click Yes, and then wait until the script finishes.
4. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
5. Click New Query and type the following Transact-SQL code to perform a tail-log backup:
6. Click Execute, and view the resulting message to verify that the backup is successful.
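The tail-log backup code referred to in step 5 might look like the following (a sketch; the backup path is assumed):

```sql
-- Back up the tail of the log without truncating it
BACKUP LOG AdventureWorks
TO DISK = 'D:\Demofiles\Mod07\AW-TailLog.bak'
WITH NO_TRUNCATE;
```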
Restore a Database
1. In Object Explorer, right-click the AdventureWorks database, point to Tasks, point to Restore, and
then click Database.
2. In the Restore Database - AdventureWorks dialog box, note that the restore operation will restore
both the full backup and the transaction log that you recently backed up, and then click OK.
3. In the Microsoft SQL Server Management Studio dialog box, note that the restore operation was
successful, and then click OK.
Lesson 3
Advanced Restore Scenarios
The techniques discussed in the previous lesson cover most common restore scenarios. However, there are
some more complex restore scenarios for which a DBA must be prepared.
This lesson discusses restore scenarios for file and filegroup backups, encrypted backups, individual data
pages, and system databases.
Lesson Objectives
After completing this lesson, you will be able to:
1. Create a tail-log backup of the active transaction log.
2. Restore each damaged file from the most recent file backup of that file.
3. Restore the most recent differential file backup, if any, for each restored file.
4. Restore transaction log backups in sequence, starting with the backup that covers the oldest of the
restored files and ending with the tail-log backup created in step 1.
To bring the database back to a consistent state, you must restore the transaction log backups that were
created after the file backups. The transaction log backups can be rolled forward quickly, because only the
changes that relate to the restored files or filegroups are applied. Undamaged files are not copied and
then rolled forward. However, you still need to process the whole chain of log backups.
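The file restore sequence described above might be sketched as follows (the logical file name and paths are assumed):

```sql
-- Restore the damaged file from its most recent file backup
RESTORE DATABASE AdventureWorks
FILE = 'AdventureWorks_Data2'
FROM DISK = 'D:\Backups\AW-File.bak'
WITH NORECOVERY;

-- Roll forward the chain of log backups that cover the restored file
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log.bak'
WITH NORECOVERY;

-- Finish with the tail-log backup and recover the database
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-TailLog.bak'
WITH RECOVERY;
```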
MCT USE ONLY. STUDENT USE PROHIBITED
7-12 Restoring SQL Server Databases
1. Restore the latest partial full database backup, specifying the read/write filegroups to be restored and
using the PARTIAL option to indicate that read-only filegroups will be restored separately.
2. Restore the latest partial differential backup, and log file backups if they exist. Use the RECOVERY
option with the last RESTORE operation to recover the database. Data in read/write filegroups is now
available.
3. Restore each read-only filegroup backup with the RECOVERY option to bring them online.
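The piecemeal restore steps above might be sketched as follows (filegroup names and paths are assumed):

```sql
-- 1. Restore the read/write filegroups from the partial backup
RESTORE DATABASE AdventureWorks
FILEGROUP = 'PRIMARY', FILEGROUP = 'ReadWriteFG'
FROM DISK = 'D:\Backups\AW-Partial.bak'
WITH PARTIAL, NORECOVERY;

-- 2. Restore the latest log backup and recover;
--    data in the read/write filegroups is now available
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log.bak'
WITH RECOVERY;

-- 3. Restore each read-only filegroup separately to bring it online
RESTORE DATABASE AdventureWorks
FILEGROUP = 'ReadOnlyFG'
FROM DISK = 'D:\Backups\AW-ReadOnlyFG.bak'
WITH RECOVERY;
```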
1. Create a database master key for the master database. This does not need to be the same
database master key that was used in the original instance but, if you are recovering from a complete
server failure, you can restore the original database master key from a backup.
2. Create a certificate or key from a backup. Use the CREATE CERTIFICATE or CREATE ASYMMETRIC
KEY statement to create a certificate or key from the backup you created of the original key used to
encrypt the database. The new certificate or key must have the same name as the original; if you used
a certificate, you must restore both the public certificate and the private key.
3. Restore the database. Now that the encryption key is available on the SQL Server instance, you can
restore the database as normal.
The following code sample shows how to restore an encrypted database backup on a new SQL Server
instance:
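A sketch of this sequence follows (certificate name, paths, and passwords are assumptions for illustration):

```sql
USE master;

-- 1. Create a database master key for master on the new instance
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa55w.rd';

-- 2. Recreate the certificate from the backups of the original
--    public certificate and private key
CREATE CERTIFICATE BackupCert
FROM FILE = 'D:\Backups\BackupCert.cer'
WITH PRIVATE KEY (
    FILE = 'D:\Backups\BackupCert.pvk',
    DECRYPTION BY PASSWORD = 'Pa55w.rd'
);

-- 3. Restore the encrypted backup as normal
RESTORE DATABASE AdventureWorks
FROM DISK = 'D:\Backups\AW-Encrypted.bak'
WITH RECOVERY;
```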
Demonstration Steps
Restore an Encrypted Backup
1. Start SQL Server Management Studio and connect to the MIA-SQL\SQL2 database engine using
Windows authentication.
2. In Object Explorer, expand Databases and view the existing databases on this instance.
3. Open the Restore Encrypted Backup.sql script file in the D:\Demofiles\Mod07 folder.
4. Select the code under the comment Try to restore an encrypted backup and click Execute. Note
that this fails because the required certificate is not present.
5. Select the code under the comment Create a database master key for master and click Execute.
This creates a database master key for the master database on MIA-SQL\SQL2.
6. Select the code under the comment Import the backed up certificate and click Execute. This
creates a certificate from public and private key backups that were taken from the MIA-SQL instance.
7. Select the code under the comment Restore the encrypted database and click Execute. Note that
this time the restore operation succeeds.
8. In Object Explorer, refresh the Databases folder and verify that the AdventureWorks database has
been restored.
9. Close SQL Server Management Studio, without saving any changes.
The final two steps of backing up and restoring the log are required to ensure that the final log sequence
number (LSN) of the restored pages is set as the REDO target of the transaction log.
Online page restore is only supported in SQL Server Enterprise edition. You can perform an offline page
restore by using the following procedure:
4. Restore each subsequent transaction log backup with the NORECOVERY option.
Restoring a Page
-- Restore pages from the full backup
RESTORE DATABASE AdventureWorks PAGE='1:55, 1:207'
FROM DISK = 'D:\Backups\AdventureWorks.bak'
WITH FILE=1, NORECOVERY;
-- Restore the log to set the correct REDO LSN and recover
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log.bak'
WITH RECOVERY;
master
The master database holds all system-level
configurations. SQL Server needs the master
database before a SQL Server instance can run at
all. SQL Server cannot start without the master
database; therefore, if it is missing or corrupt, you
cannot execute a standard RESTORE DATABASE
command to restore it. Before starting to recover
the master database, you must have access to a
temporary master database so that the SQL Server instance will start. This temporary master database
does not need to have the correct configuration—it will only be used to start up the instance to initiate
the recovery process to restore the correct version of your master database. There are two ways that you
can obtain a temporary master database:
You can use the SQL Server setup program to rebuild the system databases, either from the location
from which you installed SQL Server, or by running the setup program found at Microsoft SQL
Server\140\Setup Bootstrap\SQL2017\setup.exe.
Note: Rerunning the setup program will overwrite all your system databases, so you must
ensure that they are regularly backed up and can be restored after you have restored the master
database.
You can use a file-level backup of the master database files to restore the master database. You
must take this file-level backup when the master database is not in use—that is, when SQL Server is
not running—or by using the VSS service.
Note: Copying the master database from another instance is not supported. The VSS
service is beyond the scope of this course.
When you have created a temporary version of the master database, you can use the following
procedure to recover the correct master database:
1. Start the server instance in single-user mode by using the –m startup option.
2. Use a RESTORE DATABASE statement to restore a full database backup of the master database. It is
recommended that you execute the RESTORE DATABASE statement by using the sqlcmd utility.
After restoring the master database, the instance of SQL Server will shut down and terminate your sqlcmd
connection.
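For example, after starting the instance in single-user mode, you might run the following from the sqlcmd utility (a sketch; the backup path is assumed):

```sql
-- Restore master; the instance shuts down when the restore completes
RESTORE DATABASE master
FROM DISK = 'D:\Backups\master.bak'
WITH REPLACE;
```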
model
The model database is the template for all databases that are created on the instance of SQL Server.
When the model database is corrupt, the instance of SQL Server cannot start. This means that a normal
restore command cannot be used to recover the model database if it becomes corrupted. In the case of a
corrupt model database, you must start the instance with the -T3608 trace flag as a command-line
parameter. This trace flag starts only the master database. When SQL Server is running, you can restore
the model database by using the normal RESTORE DATABASE command.
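The sequence might be sketched as follows (the backup path is assumed):

```sql
-- Start the instance with trace flag 3608 first, for example:
--   NET START MSSQLSERVER /T3608
-- Then restore model using the normal RESTORE DATABASE command:
RESTORE DATABASE model
FROM DISK = 'D:\Backups\model.bak'
WITH REPLACE;
```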
msdb
SQL Server Agent uses the msdb database for scheduling alerts and jobs, and for recording details of
operators. The msdb database also contains history tables, such as those that record details of backup
and restore operations. If the msdb database becomes corrupt, SQL Server Agent will not start. You can
restore the msdb database by using the RESTORE DATABASE statement as you would a user database—
the SQL Server Agent service can then be restarted.
resource
The resource database is read-only and contains copies of all system objects that ship with SQL Server.
This is a hidden database and you cannot perform backup operations on it. It can, however, be corrupted
by failures in areas such as I/O subsystems or memory. If the resource database is corrupt, it can be
restored by a file-level restore in Windows or by running the setup program for SQL Server.
tempdb
The tempdb database is a workspace for holding temporary or intermediate result sets. This database is
recreated every time an instance of SQL Server starts, so there is no need to back up or restore it. When
the server instance is shut down, any data in tempdb is permanently deleted.
Lesson 4
Point-in-time Recovery
In the previous lesson, you learned how to recover a database to the latest point in time possible.
However, you may need to recover the database to an earlier point in time. You have also learned that
you can stop the restore process after any of the backups are restored and initiate the recovery of the
database. While stopping the restore process after restoring an entire backup file provides a coarse level
of control over the recovery point, SQL Server provides additional options that allow for more fine-
grained control.
In this lesson, you will learn about how point-in-time recovery works and how to use the options that it
provides.
Lesson Objectives
After completing this lesson, you will be able to:
Describe point-in-time recovery.
Note: If a user error causes the inadvertent deletion of some data, you may not be aware of
when the error actually occurred. Therefore, you will not know which log file contains the
deletion and the point at which to recover the database. You can use the WITH STANDBY option
on each log file restore and inspect the state of the database after each restore operation, to
determine when the error occurred and when to recover the database.
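For example, a log restore using the WITH STANDBY option might be sketched as follows (paths are assumed); the database remains readable for inspection between restores:

```sql
-- Restore a log backup, leaving the database in read-only standby mode;
-- the undo file preserves uncommitted transactions so the restore can continue
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log1.bak'
WITH STANDBY = 'D:\Backups\AW-Undo.bak';
```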
STOPAT Option
You use the STOPAT option to specify a recovery
point that is based on a datetime value. You
might not know in advance which transaction log
backup file contains transactions from the time
where the recovery needs to occur. Therefore,
you can specify both the STOPAT and the
RECOVERY options on each RESTORE LOG
command in the sequence.
If the specified time is later than the last time contained in the transaction log backup, the restore
command restores the log, issues a warning message, and does not recover the database, so that
additional transaction log backups can be applied.
This behavior ensures that the database is recovered up to the requested point, even when STOPAT and
RECOVERY are both specified with every restore.
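A STOPAT restore might therefore be sketched as follows (the path and datetime value are assumed):

```sql
-- Specify STOPAT and RECOVERY on each log restore; the database is only
-- recovered when the requested point in time is reached
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log.bak'
WITH STOPAT = '2017-06-01 14:30:00', RECOVERY;
```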
STOPATMARK Option
If you require more precise control over the
recovery point, you can use the STOPATMARK
option of the RESTORE Transact-SQL statement.
Marking a Transaction
BEGIN TRAN UpdPrc WITH MARK 'Start of nightly update process';
If you do not know the name of a transaction that was marked, you can query the dbo.logmarkhistory
table in the msdb database.
The STOPATMARK option is similar to the STOPAT option for the RESTORE command. SQL Server will stop
at the named transaction mark and include the named transaction in the redo phase. If you wish to
exclude the transaction (that is, restore everything up to the beginning of the named transaction), you can
use the STOPBEFOREMARK option instead. If the transaction mark is not found in the transaction log
backup that is being restored, the restore completes and the database is not recovered, so that other
transaction log backups can be restored.
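For example, restoring up to and including the UpdPrc mark shown earlier might look like the following (a sketch; the path is assumed):

```sql
-- Stop at the named mark, including the marked transaction in the redo phase
RESTORE LOG AdventureWorks
FROM DISK = 'D:\Backups\AW-Log.bak'
WITH STOPATMARK = 'UpdPrc', RECOVERY;
```

To exclude the marked transaction instead, replace STOPATMARK with STOPBEFOREMARK.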
The STOPATMARK feature is mainly used when you want to restore an entire set of databases to a
mutually consistent state, at some earlier point in time. If you need to perform a backup of multiple
databases, so that they can all be recovered to a consistent point, consider marking all the transaction
logs before commencing the backups.
Note: You cannot use the stop at mark functionality in SSMS; it is only available by using
the Transact-SQL statement.
Demonstration Steps
Perform a Point-in-time Recovery
1. Start SQL Server Management Studio and connect to the MIA-SQL database engine using Windows
authentication.
2. In SQL Server Management Studio, open the Point-in-Time Restore.sql script file in the
D:\Demofiles\Mod07 folder.
3. Select and execute the code under the comment Create a database and back it up. This creates a
database with a single table, and performs a full backup.
4. Select and execute the code under the comment Enter some data. This inserts a record into the
Customers table.
5. Select and execute the code under the comment Get the current time. This displays the current date
and time. Make a note of the current time.
6. Wait until a minute has passed, and then select and execute the code under the comment Get the
current time again to verify that it is now at least a minute since you noted the time.
7. Select and execute the code under the comment Enter some more data. This inserts a second record
into the Customers table.
8. Select and execute the code under the comment Backup the transaction log. This performs a
transaction log backup of the database.
10. In Object Explorer, expand Databases and verify that BackupDemo is listed (if not, right-click the
Databases folder, and click Refresh).
11. Right-click the BackupDemo database, point to Tasks, point to Restore, and then click Database.
13. In the Backup Timeline: BackupDemo dialog box, select Specific date and time and set the Time
value to the time you noted earlier (before any data was inserted), and then click OK.
15. When notified that the database has been restored successfully, click OK.
16. In Object Explorer, expand the BackupDemo database, expand the Tables folder, right-click
dbo.Customers, and then click Select Top 1000 Rows. When the results are displayed, verify that
the database was restored to the point in time before any data was inserted.
17. Close SQL Server Management Studio without saving any files.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
2. Use the following Transact-SQL query to try to bring the database online:
3. Review the error message, and then check the contents of the D:\Labfiles\Lab07\Starter\Setupfiles
folder to determine if the HumanResources.mdf file is present. If not, the database cannot be
brought online because the primary data file is lost.
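The query referred to in step 2 might be (a sketch):

```sql
USE master;
ALTER DATABASE HumanResources SET ONLINE;
```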
Results: After this exercise, you should have restored the HumanResources database.
2. Use the following Transact-SQL query to try to bring the database online:
3. Review the error message, and then check the contents of the D:\Labfiles\Lab07\Starter\Setupfiles
folder to verify that the InternetSales.mdf file is present. This file has become corrupt, and has
rendered the database unusable.
2. Use the following Transact-SQL code to back up the tail of the transaction log:
USE master;
BACKUP LOG InternetSales TO DISK = 'D:\Labfiles\Lab07\Backups\InternetSales.bak'
WITH NO_TRUNCATE;
2. In this case, the backup history for the database has been lost, so you must specify the backup media
sets for the existing planned backups, in addition to the tail-log backup you have just performed.
The planned backups should be restored using the NORECOVERY option, and then the tail-log
backup should be restored using the RECOVERY option.
Results: After this exercise, you should have restored the InternetSales database.
USE master;
RESTORE DATABASE AWDataWarehouse FILEGROUP='Current'
FROM DISK = 'D:\Labfiles\Lab07\Backups\AWDataWarehouse.bak'
WITH REPLACE, PARTIAL, FILE = 1, NORECOVERY;
Note: This code restores the primary filegroup and the Current filegroup from a full
database backup on the AWDataWarehouse.bak media set. The PARTIAL option indicates that
only the primary and named read/write filegroups should be restored, and the NORECOVERY
option leaves the database in a restoring state, ready for subsequent restore operations of the
read/write filegroup data.
3. Refresh the Databases folder in Object Explorer to verify that the database is in a restoring state.
2. Verify that the database is now shown as online in Object Explorer, and that you can query the
dbo.FactInternetSales table.
3. Verify that you cannot query the dbo.FactInternetSalesArchive table, because it is stored in a
filegroup that has not yet been brought online.
Results: After this exercise, you will have restored the AWDataWarehouse database.
When planning a database recovery solution, consider the following best practices:
Best Practice:
Don’t forget to back up the tail of the log before starting a restore sequence.
If available, use differential restore to reduce the time taken by the restore process.
Use file level restore to speed up restores when not all database files are corrupt.
Perform regular database backups of master, msdb and model system databases.
Create a disaster recovery plan for your SQL Server and make sure you regularly test restoring
databases.
Review Question(s)
Question: What are the three phases of the restore process?
Module 8
Automating SQL Server Management
Contents:
Module Overview 8-1
Lesson 1: Automating SQL Server Management 8-2
Module Overview
The tools provided by Microsoft® SQL Server® make administration easy when compared to some other
database engines. However, even when tasks are easy to perform, it is common to have to repeat a task
many times. Efficient database administrators learn to automate repetitive tasks. This can help to avoid
situations where an administrator forgets to execute a task at the required time. Perhaps more
importantly, the automation of tasks helps to ensure that they are performed consistently, each time they
are executed.
This module describes how to use SQL Server Agent to automate jobs, how to configure security contexts
for jobs, and how to implement multiserver jobs.
Objectives
After completing this module, you will be able to:
Describe methods for automating SQL Server management.
Lesson 1
Automating SQL Server Management
You can gain many benefits from the automation of SQL Server management. Most of the benefits center
on the reliable, consistent execution of routine management tasks. SQL Server is a flexible platform that
provides a number of ways to automate management, but the most important tool for this is the SQL
Server Agent. All database administrators working with SQL Server must be familiar with the configuration
and ongoing management of SQL Server Agent.
Lesson Objectives
After completing this lesson, you will be able to:
Describe the available options for automating SQL Server management and the framework that SQL
Server Agent provides.
Effectively use SQL Server Agent.
Reliable execution of routine tasks. When you perform routine tasks manually, there is always a
chance that you might overlook a vital task. For example, a database administrator could forget to
perform database backups. By using automation, administrators can focus on exceptions that occur
during the routine tasks, rather than on the execution of the tasks.
Consistent execution of routine tasks. Another problem that can occur when you perform routine
tasks manually is that you may not perform the tasks the same way each time. Imagine a situation
where a database administrator archives some data from a set of production tables into a set of
history tables every Monday morning. The new tables must have the same name as the originals with
a suffix that includes the current date.
While the administrator might remember to perform this task every Monday morning, there is a
possibility that one or more of the following errors could occur:
Anyone who has been involved in the ongoing administration of systems will tell you that these and other
problems will occur from time to time, even when the tasks are executed by experienced and reliable
administrators. Automating routine tasks can assist greatly in making sure that they are performed
consistently every time.
Proactive Management
After you automate routine tasks, it is possible that their execution fails but no one notices. For example,
an automated backup of databases may fail but this failure is not noticed until one of the backups is
needed.
In addition to automating the routine tasks, you must ensure that you create notifications for when the
tasks fail, even if you cannot imagine a situation where they might. For example, you may create a backup
strategy that produces database backups in a given folder. The job might run reliably for years until
another administrator inadvertently deletes or renames the target folder. You have to know as soon as
this problem occurs, so that you can rectify the situation.
A proactive administrator will try to detect potential problems before they occur. For example, rather than
receiving a notification that a job failed because a disk was full, an administrator might schedule regular
checks of available disk space and make sure a notification is received when it is starting to get too low.
SQL Server provides alerts on system and performance conditions for this type of scenario.
Maintenance Plans
SQL Server provides a Maintenance Plan Wizard that prompts administrators to consider regular tasks to
maintain the health of SQL Server. Behind the scenes, the wizard creates SQL Server Agent jobs and a SQL
Server Integration Service (SSIS) package to orchestrate these jobs and manage workflow.
PowerShell
This command-line utility can be used to script, and therefore automate, many tasks inside and outside
SQL Server. These scripts can be included as tasks in SQL Server Agent jobs, but also scheduled outside
SQL Server to perform more complicated operations.
The SQL Server Agent supplies a management framework that is based on four core object types:
Jobs
You can use jobs to execute command-line scripts, Windows PowerShell® scripts, Transact-SQL scripts,
SSIS packages, and so on. You can also use them to schedule a wide variety of task types, including
tasks involved in the implementation of other SQL Server features. These include replication, Change Data
Capture (CDC), Data Collection, and Policy Based Management (PBM).
Note: Replication, CDC, and PBM are advanced topics that are beyond the scope of this
course.
Schedules
Schedules specify when, and how, a job runs. It’s a many-to-many relationship, in that a schedule can run
many jobs, and a job can be included in many schedules.
Alerts
The alert system provided by SQL Server Agent is capable of responding to a wide variety of alert types,
including SQL Server error messages, SQL Server performance counter events, and Windows Management
Instrumentation (WMI) alerts.
Operators
You can configure an action to happen in response to an alert, such as the execution of a SQL Server
Agent job or sending a notification to an administrator. In SQL Server Agent, administrators that you can
notify are called operators. One common way of notifying operators is by using Simple Mail Transfer
Protocol (SMTP)-based email. (Alerts and operators are discussed later in this course.)
Note: You can use other SQL Server features to automate complex monitoring tasks—for
example, Extended Events—but this is beyond the scope of this course.
Demonstration Steps
Create a Job
1. Ensure that the 20764C-MIA-DC and 20764C-MIA-SQL virtual machines are running, and then log
on to 20764C-MIA-SQL as AdventureWorks\Student with the password Pa55w.rd.
2. In File Explorer, navigate to D:\Demofiles\Mod08, right-click Setup.cmd, and then click Run as
administrator.
5. In the Connect to Server dialog box, in the Server name box, type MIA-SQL, and then click
Connect.
6. In Object Explorer, expand SQL Server Agent, and then expand Jobs.
9. In the New Job dialog box, in the Name box, type Manual AdventureWorks Backup, and in the
Category list, click Database Maintenance.
11. In the New Job Step dialog box, in the Step name box, type Backup AdventureWorks, and in the
Database list, click AdventureWorks.
12. In the Command box, type the following, and then click OK:
14. In Object Explorer, under Jobs, note that the new job, Manual AdventureWorks Backup is
displayed.
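The command typed in step 12 might look like the following (a sketch; the backup path is assumed):

```sql
BACKUP DATABASE AdventureWorks
TO DISK = 'D:\Demofiles\Mod08\AdventureWorks.bak'
WITH INIT;
```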
1. In Object Explorer, under Jobs, right-click Manual AdventureWorks Backup, and then click Start
Job at Step.
2. In the Start Jobs - MIA-SQL dialog box, note that the status of each step when it finishes changes to
Success, and then click Close.
3. In Object Explorer, under SQL Server Agent, double-click Job Activity Monitor.
4. In the Job Activity Monitor - MIA-SQL window, review the information available for SQL Server
Agent jobs.
5. In the Agent Job Activity table, review the information in the Manual AdventureWorks Backup
row, and then click Close.
Categorize Activity
What are the four core objects types provided by SQL Server Agent?
Items
1 Jobs
2 Maintenance Plans
3 Schedules
4 Backup Tasks
5 Alerts
6 Logs
7 Operators
8 SCOM Reporting
Category 1: Provided by SQL Server Agent
Category 2: Not provided by SQL Server Agent
Lesson 2
Working with SQL Server Agent
Because SQL Server Agent is the primary tool for automating tasks within SQL Server, database
administrators must be proficient at creating and configuring SQL Server Agent jobs. You can create jobs
to implement a variety of different types of task and categorize them for ease of management.
In this lesson, you will learn how to create, schedule, and script jobs.
Lesson Objectives
After completing this lesson, you will be able to:
Script jobs.
It is important to learn to script jobs that have been created so that, if a failure occurs, you can quickly
recreate the job and reconstruct it in other environments. For example, you may create your jobs in a test
environment, but then want to move them to your production environment.
Executing SQL Server Integration Services and Analysis Services commands and queries.
Note: While the ability to execute ActiveX® scripts is retained for backwards compatibility,
this option is deprecated and you should not use it for new development.
Creating Jobs
You can use SSMS to create jobs or you can execute the sp_add_job system stored procedure, in addition
to other system stored procedures, to add steps and schedules to the job. After you create the job, SQL
Server stores the job definition in the msdb database, alongside all the SQL Server Agent configuration.
This example shows how to create a simple job in Transact-SQL. This only creates a job; a job step will be
added in the next topic.
Using sp_add_job
USE msdb;
GO
EXEC sp_add_job
@job_name = 'HR database backup',
@enabled = 1,
@description = 'Backup the HR database';
GO
A third option is to make use of the SQL Server Management Objects, and call the Create method from
C# or PowerShell.
Job Categories
You can organize your jobs into categories either by using the SQL Server built-in categories, such as
Database Maintenance, or by defining your own.
This is useful when you have to perform actions that are associated with jobs in a specific category. For
example, you could create a job category called SQL Server Policy Check and write a PowerShell script to
execute all the jobs in that category against your SQL Server servers.
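In Transact-SQL, a category is created with sp_add_category and assigned with sp_update_job. The following sketch uses the HR database backup job from this module's examples; the category name is illustrative:

```sql
USE msdb;
GO

-- Create a user-defined local job category (the name is illustrative).
EXEC sp_add_category
    @class = 'JOB',
    @type = 'LOCAL',
    @name = 'SQL Server Policy Check';
GO

-- Move an existing job into the new category.
EXEC sp_update_job
    @job_name = 'HR database backup',
    @category_name = 'SQL Server Policy Check';
GO
```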
Argument Description
@subsystem Subsystem for SQL Server Agent to use to execute the command (for
example, TSQL, Dts, CMDEXEC or PowerShell). TSQL is the default value.
@command Command to execute. SQL Server Agent provides token substitution that
gives you the same flexibility that variables provide when you write
software programs.
@on_success_action Action to perform if the step succeeds (for example, quit or go to the next
step).
@on_fail_action Action to perform if the step fails (for example, quit or go to the next step).
For a full list of the available parameters, see sp_add_jobstep (Transact-SQL) in Microsoft Docs:
sp_add_jobstep (Transact-SQL)
http://aka.ms/Cmljv9
By default, SQL Server Agent advances to the next job step when a step succeeds, and stops the job when a step fails.
However, job steps can continue with any step defined in the job, using the success or failure flags. By
configuring the action to occur on the success and failure of each job step, you can create a workflow that
determines the overall logic flow of the job. Note that, in addition to each job step having a defined
outcome, the overall job reports an outcome. This means that, even though some job steps may succeed,
the overall job might still fail.
You can specify the number of times that SQL Server should attempt to retry execution of a job step if the
step fails. You can also specify the retry intervals (in minutes). For example, if the job step requires a
connection to a remote server, you could define several retry attempts in case the connection fails.
Using sp_add_jobstep
USE msdb;
GO
EXEC sp_add_jobstep
@job_name = 'HR database backup',
@step_name = 'Set HR database to read only',
@subsystem = 'TSQL',
@command = 'ALTER DATABASE HR SET READ_ONLY',
@retry_attempts = 2,
@retry_interval = 2;
GO
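To illustrate the workflow options described above, the following sketch updates the first step to continue on success, and then adds a second step that ends the job with an explicit outcome. The action values come from the sp_add_jobstep documentation; the backup path is illustrative:

```sql
USE msdb;
GO

-- Make the existing first step continue to the next step on success.
EXEC sp_update_jobstep
    @job_name = 'HR database backup',
    @step_id = 1,
    @on_success_action = 3;  -- 3 = go to the next step

-- Add a second step that ends the job with an explicit outcome.
EXEC sp_add_jobstep
    @job_name = 'HR database backup',
    @step_name = 'Back up the HR database',
    @step_id = 2,
    @subsystem = 'TSQL',
    @command = 'BACKUP DATABASE HR TO DISK = ''D:\Backups\HR.bak''',
    @on_success_action = 1,  -- 1 = quit the job reporting success
    @on_fail_action = 2;     -- 2 = quit the job reporting failure
GO
```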
One-time execution.
Start automatically when SQL Server Agent starts.
Even though a job might have multiple schedules, SQL Server will limit it to a single concurrent execution.
If you try to run a job manually while it is running as scheduled, SQL Server Agent refuses the request.
Similarly, if a job is still running when it is scheduled to run again, SQL Server Agent refuses to let it do so.
The following example shows how to create and attach a schedule for Shift 1 to the job created in earlier
examples:
EXEC sp_add_schedule
@schedule_name = 'Shift 1',
@freq_type = 4,
@freq_interval = 1,
@freq_subday_type = 0x8,
@freq_subday_interval = 1,
@active_start_time = 080000,
@active_end_time = 170000;
GO
EXEC sp_attach_schedule
@job_name = 'HR database backup',
@schedule_name = 'Shift 1';
GO
Scripting Jobs
There are two approaches to scripting jobs. The first is to script a standard database action as a new job; the second is to script existing jobs. The easiest way to complete both tasks is by using SSMS, although you can also complete them by using PowerShell.
If the last option is selected, the New Job dialog is opened and prepopulated with the current task as the
first step in the job.
Demonstration Steps
Script a Task to a Job
1. In SSMS, in Object Explorer, expand Databases, right-click AdventureWorks, point to Tasks, and
then click Back Up.
2. In the Back Up Database - AdventureWorks dialog box, in the Destination section, click the
existing backup destination, click Remove, and then click Add.
3. In the Select Backup Destination dialog box, in the File name box, type
D:\Demofiles\Mod08\Backups\AdventureWorksScript.bak, and then click OK.
4. In the Back Up Database - AdventureWorks dialog box, on the toolbar, in the Script drop-down
list, select Script Action to Job.
5. In the New Job dialog box, on the General page, note the default name for the job (Back Up
Database - AdventureWorks).
6. On the Steps page, note that the job includes one Transact-SQL step named 1.
8. In the New Job Schedule dialog box, in the Name box, type Week Days.
9. In the Frequency section, select the Monday, Tuesday, Wednesday, Thursday, and Friday check
boxes, clear the Sunday check box, and then click OK.
1. In Object Explorer, under Jobs, right-click Check AdventureWorks DB, point to Script Job as, point
to CREATE To, and then click New Query Editor Window. This generates the Transact-SQL code
necessary to create the job.
2. In Object Explorer, right-click Back Up Database - AdventureWorks, point to Script Job as, point
to CREATE To, and then click Clipboard.
3. Place the insertion point at the end of the Transact-SQL code in the query editor window, and then
on the Edit menu, click Paste.
4. Save the Transact-SQL script as Create Jobs.sql in the D:\Demofiles\Mod08 folder.
5. This technique is useful for generating job-creation scripts so that jobs can be recreated if they are accidentally deleted or are required on a different server.
6. Keep SQL Server Management Studio open for the next demonstration.
Sequencing Activity
Put the following SQL Server Agent steps in the order required to create a job, by numbering each to
indicate the correct order.
Steps
Create a job.
Create or select an
operator for notifications.
Lesson 3
Managing SQL Server Agent Jobs
When you automate administrative tasks, you must ensure that they execute correctly. To help with this,
SQL Server writes entries to history tables in the msdb database on the completion of each job.
In this lesson, you will learn how to query the history tables, and how to troubleshoot any issues that may
occur.
Lesson Objectives
After completing this lesson, you will be able to:
The Object Explorer window in SSMS also provides a Job Activity Monitor. This displays a view of currently
executing jobs and data showing the results of the previous execution, along with the scheduled time for
the next execution of the job.
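You can also query the history tables directly. A query along the following lines joins dbo.sysjobs to dbo.sysjobhistory to return step-level history; the column selection here is illustrative:

```sql
USE msdb;
GO

SELECT job.name AS job_name,
       history.step_id,
       history.step_name,
       history.run_date,
       history.run_time,
       history.run_status,
       history.message
FROM dbo.sysjobs AS job
INNER JOIN dbo.sysjobhistory AS history
    ON job.job_id = history.job_id
    AND history.step_id <> 0  -- step_id = 0 records the overall job outcome
ORDER BY history.run_date DESC, history.run_time DESC;
```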
Note: The INNER JOIN on the dbo.sysjobhistory table has the clause history.step_id <>
0. The history table has entries where the step_id = 0. These rows contain the overall outcome of
a job. By returning all rows where step_id <> 0, the previous query returns information about
each step of the job.
For more information about the system tables that store SQL Server Agent data, see SQL Server Agent
Tables (Transact-SQL) in Microsoft Docs:
That the service account for the service is valid, that the password for the account has not changed,
and that the account is not locked out. If any of these checks fail, the service will not start
and details about the problem will be written to the computer’s system event log.
That the msdb database is online. If the msdb database is corrupt, suspect, or offline, SQL Server
Agent will not start.
That the job is scheduled. The schedule may be incorrect or the time for the next scheduled execution
may be in the future.
That the schedule is enabled. Both jobs and schedules can be disabled and a job will not run on a
disabled schedule.
Run jobs.
Demonstration Steps
Run Jobs
3. In Object Explorer, right-click Check AdventureWorks DB, and then click Start Job at Step.
4. Note that this job does not start automatically because it has more than one job step.
5. In the Start Job on 'MIA-SQL' dialog box, in the Start execution at step table, click 1, and then
click Start. Note that the job fails, and then click Close.
Troubleshoot a Failed Job
1. In Object Explorer, right-click Back Up Database - AdventureWorks, and then click View History.
2. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent instance of the
job, note that all steps succeeded, and then click Close.
3. In Object Explorer, right-click Check AdventureWorks DB, and then click View History.
4. In the Log File Viewer - MIA-SQL dialog box, expand the date for the most recent failed instance of
the job, and note that the step 3 failed.
5. Select the step that failed, and in the pane at the bottom of the dialog box, view the message that
was returned. You may have to scroll to the bottom. Then click Close.
7. In the Job Properties - Check AdventureWorks DB dialog box, on the Steps page, in the Job step
list table, click step 3, and then click Edit.
9. In the Parse Command Text dialog box, note the same error message that was shown in Job History,
and then click OK.
10. In the Job Step Properties - Check DB dialog box, modify the text in the Command box as follows,
and then click OK:
11. In the Job Properties - Check AdventureWorks DB dialog box, click OK.
12. In Object Explorer, right-click Check AdventureWorks DB, and then click Start Job at Step.
13. In the Start Job on 'MIA-SQL' dialog box, in the Start execution at step table, click 1, and then
click Start.
16. In the Job Activity Monitor - MIA-SQL dialog box, in the Agent Job Activity table, note the status
of the Check AdventureWorks DB job, and then click Close.
17. In the D:\Demofiles\Mod08\AdventureWorks folder, view the text files generated by the job.
18. In the D:\Demofiles\Mod08\Backups folder, verify that a backup file was created by the Back Up
Database - AdventureWorks job.
19. Keep SQL Server Management Studio open for the next demonstration.
Categorize Activity
What are four steps that can be undertaken to troubleshoot failed jobs?
Lesson 4
Multiserver Management
You may have jobs that run across multiple servers that would benefit from being automated. SQL Server
provides multiserver administration functionality for you to distribute jobs across your enterprise.
In this lesson, you will learn about the concepts behind multiserver administration and how to implement
jobs across multiple servers.
Lesson Objectives
After completing this lesson, you will be able to:
If you have a large number of target servers, avoid defining your master server on a production server that has significant performance requirements, because target server traffic may slow performance on the production server.
Demonstration Steps
Use the Master Server Wizard
1. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, right-click SQL Server Agent,
point to Multi Server Administration, and then click Make this a Master.
2. In the Master Server Wizard - MIA-SQL dialog box, on the Welcome to the Master Server Wizard
page, click Next.
3. On the Master Server Operator page, in the E-mail address box, type
[email protected], and then click Next.
4. On the Target Servers page, expand Database Engine, expand Local Server Groups, click mia-sql\sql2, click the > button, and then click Next.
1. In Object Explorer, on the toolbar, click Connect, and then click Database Engine.
2. In the Connect to Server dialog box, in the Server name list, select MIA-SQL\SQL3, and then click
Connect.
3. In Object Explorer, under MIA-SQL\SQL3, expand Databases, expand System Databases, right-click
msdb, and then click New Query.
4. In the new query window, type the following command, and then click Execute:
5. In Object Explorer, under MIA-SQL, right-click SQL Server Agent (MSX), point to Multi Server
Administration, and then click Manage Target Servers.
6. In the Target Server Status - MIA-SQL dialog box, note that both MIA-SQL\SQL2 and MIA-
SQL\SQL3 are listed as target servers.
If you change the definition of a multiserver job after distributing it to the target servers, you have to
ensure that SQL Server adds the change to the download list for the target servers to be updated.
You can do this by executing the sp_post_msx_operation stored procedure as shown in the following
example:
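A call along these lines posts an INSERT operation for a job to the download list; the job_id value shown is a placeholder:

```sql
USE msdb;
GO

-- The job_id shown is a placeholder; look up the real value in
-- dbo.sysjobs or in the job's properties in SSMS.
EXEC sp_post_msx_operation
    @operation = 'INSERT',
    @object_type = 'JOB',
    @job_id = '00000000-0000-0000-0000-000000000000';
GO
```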
You only have to execute this code when you add, update, or remove job steps or schedules;
sp_update_job and sp_delete_job automatically add entries to the download list.
Note: You can locate the job_id property by querying the dbo.sysjobhistory or
dbo.sysjobs tables or by viewing the job properties in SSMS.
Shrinking databases.
Reorganizing indexes.
Rebuilding indexes.
Updating statistics.
Cleaning up history.
Database backups.
After completing the wizard, you will see the SQL Server Agent jobs and schedules the maintenance plan
has created in the Object Explorer in SSMS.
The advantage of this approach is that you can define a set of actions once, and then run those same
steps across your whole organization—with all the executing, monitoring and reporting of these jobs
being undertaken on one master server.
Objectives
After completing this lab, you will be able to:
Test jobs.
Schedule jobs.
Configure master and target servers.
Password: Pa55w.rd
3. Add a Transact-SQL step that runs in the HumanResources database and executes the following
command. The output from the command should be saved as a text file in the D:\ folder:
4. Add an operating system command step that runs the following command to move the backup file to
the D:\Labfiles\Lab08\Starter folder:
5. Ensure that the job is configured to start with the Transact-SQL backup step.
6. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have created a job named Backup HumanResources.
Note: You may have to expand the log file row to see the details of each step executed.
3. Check that the HumanResources.bak file has been moved to the correct folder
D:\Labfiles\Lab08\Starter.
4. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have tested the SQL Server Agent job and confirmed that it
executes successfully.
2. Add a schedule to the Backup HumanResources job so that the job runs every day, two minutes
from the current system time.
3. Wait for the scheduled time, and then proceed with the next task.
3. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have created a schedule for the Backup HumanResources
job.
3. Set the destination of the backup files to be the D:\ drive, and select the option to create a sub-directory for each database.
4. Set the destination for logging to the D:\ drive.
5. Wait five minutes to check the status of the job, by looking at the contents of the D:\ folder.
2. What are the differences in the history shown on the master and target servers?
Results: At the end of this exercise, you will have created and executed a maintenance plan across
multiple servers.
Best Practice: When using a large number of target servers, avoid defining your master
server on a production server, because the target server traffic may impact performance on the
production server.
Module 9
Configuring Security for SQL Server Agent
Contents:
Module Overview 9-1
Lesson 1: Understanding SQL Server Agent Security 9-2
Module Overview
Other modules in this course have demonstrated the need to minimize the permissions that are granted
to users, following the principle of “least privilege.” This means that users have only the permissions that
they need to perform their tasks. The same logic applies to the granting of permissions to SQL Server
Agent. Although it is easy to execute all jobs in the context of the SQL Server Agent service account, and
to configure that account as an administrative account, a poor security environment would result from
doing this. It is important to understand how to create a minimal privilege security environment for jobs
that run in SQL Server Agent.
Objectives
After completing this module, you will be able to:
Configure credentials.
Lesson 1
Understanding SQL Server Agent Security
SQL Server Agent is a powerful tool for scheduling and executing a diverse range of administrative and
line-of-business tasks. However, to make best use of it, you must understand the security model that is
used for SQL Server Agent jobs and tasks. Many of the tasks that are executed by SQL Server Agent are
administrative in nature, but a number of other tasks are performed on behalf of users. The need to be
able to execute a wide variety of task types leads to a requirement for flexible security configuration.
Jobs need to be able to access many types of objects. In addition to objects that reside inside SQL Server,
jobs often need to access external resources, such as operating system files and folders. These operating
system (and other) dependencies also require a configurable and layered security model, to avoid the
need to grant too many permissions to the SQL Server Agent service account.
Lesson Objectives
At the end of this lesson, you will be able to:
Note: The Local System account option is supported for backward compatibility only. For
security reasons, the Network Service account option is also not recommended: Network Service
has more capabilities than the service account requires. An account that has only the required
permissions should be created and used instead.
A Windows domain account should be used and configured with the least possible privileges that will still
allow operation. The account must be a member of the sysadmin fixed server role. It also requires the
following Windows permissions:
Log on as a service
Note: The SQL Server Agent account cannot use SQL Server Authentication for its
connection to the SQL Server Database Engine.
In many circumstances, SQL Server Agent jobs that run tasks that trigger Windows programs are executed
in the security context of the service account. The options for configuring security for SQL Server Agent
job steps are discussed later in this module.
SQLAgentUserRole
SQLAgentReaderRole
SQLAgentOperatorRole
Each role has different permissions assigned to it, and roles that have greater privileges inherit the
permissions of less privileged roles. For example, the SQLAgentOperatorRole role inherits the
permissions of SQLAgentUserRole and SQLAgentReaderRole, in addition to the permissions that are
assigned to it directly.
Note: To see and use the SQL Server Agent section of Object Explorer in SQL Server
Management Studio, a user must be a member of one of the fixed SQL Server Agent roles, or a
member of sysadmin.
SQLAgentUserRole
Members of SQLAgentUserRole have permission to edit, delete, execute, and view history of local jobs
and job schedules of which they are the owner. They can also create new jobs and job schedules; the user
will be the owner of any jobs and schedules that he or she creates.
SQLAgentReaderRole
Members of SQLAgentReaderRole have all of the permissions of SQLAgentUserRole; in addition, they
can view job definitions, job schedule definitions, and job history for jobs that other users own—including
multiserver jobs.
SQLAgentOperatorRole
Members of SQLAgentOperatorRole have all of the permissions of SQLAgentUserRole and
SQLAgentReaderRole; in addition, they can enable and disable job definitions and job schedules that
other users own.
For more information about the fixed SQL Server Agent roles, including a complete breakdown of the
permissions of each role, see the topic SQL Server Agent Fixed Database Roles in Microsoft Docs:
Discussion Topics
Question: Which SQL Server resources would SQL
Server Agent jobs potentially depend upon?
You specify an alternate database user name for a Transact-SQL job step in the Run as user box on the
Advanced page of the Job step properties dialog box, or by using the @database_user_name parameter
of the sp_add_jobstep and sp_update_jobstep system stored procedures.
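For example, a Transact-SQL job step might be configured to run as a specific database user as follows; the step, user, and procedure names are illustrative, and the database user must already exist in the target database:

```sql
USE msdb;
GO

-- All names here are illustrative.
EXEC sp_add_jobstep
    @job_name = 'HR database backup',
    @step_name = 'Refresh report data',
    @subsystem = 'TSQL',
    @database_name = 'HumanResources',
    @database_user_name = 'HRReportingUser',
    @command = 'EXEC dbo.RefreshReportData;';
GO
```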
Note: Although the General page of the Job step properties dialog box has a Run As
box, this property is not used from Transact-SQL job steps.
Note: As discussed earlier in the lesson, the range of permissions required to carry out
different job steps can lead to the SQL Server Agent service account being highly privileged.
Proxy Accounts
As an alternative to using the SQL Server Agent account, you can use a proxy account to associate a job
step with a Windows identity by using an object called a credential. You can associate a proxy account
with one or more of the non-Transact-SQL subsystems that are used in job steps. Using proxy accounts
means that you can use different Windows identities to perform the various tasks that are required in jobs.
It provides better control of security by avoiding the need for a single account to have all of the
permissions that are required to execute all jobs. Members of the SQL Server Agent fixed roles who are
not members of sysadmin can be granted permission to use proxy accounts to execute job steps that are
not based on Transact-SQL.
Creating and configuring credentials and proxy accounts is covered later in this module.
To select a proxy account for a step that is not based on Transact-SQL, you use the Run As box on the
General page of the Job step properties dialog box, or the @proxy_name parameter of the
sp_add_jobstep and sp_update_jobstep system stored procedures.
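In Transact-SQL, assigning a proxy account to a non-Transact-SQL step might look like the following sketch; the proxy name, job name, and file paths are illustrative, and the proxy must already exist:

```sql
USE msdb;
GO

-- FileOperationProxy is an illustrative proxy name; the CmdExec
-- subsystem runs operating system commands.
EXEC sp_add_jobstep
    @job_name = 'Copy Export File',
    @step_name = 'Copy File',
    @subsystem = 'CmdExec',
    @command = 'copy D:\ExportData\export.txt D:\ImportData\import.txt /Y',
    @proxy_name = 'FileOperationProxy';
GO
```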
For more information about configuring job steps by using sp_add_jobstep, see the topic sp_add_jobstep
(Transact-SQL) in Microsoft Docs:
sp_add_jobstep (Transact-SQL)
http://aka.ms/cmljv9
4. Check the securable objects that each job step task accesses. This includes any Transact-SQL objects
that need to be accessed and any Windows files (including network paths) or other resources that
need to be accessed.
5. For each failing step, check that the account that is being used to execute the step can access the
resources that you have determined as necessary for the step.
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. In the D:\Demofiles\Mod09 folder, right-click Setup.cmd, and then click Run as Administrator.
4. Start Microsoft SQL Server Management Studio, and connect to the MIA-SQL Database Engine
instance by using Windows authentication.
5. On the File menu, point to Open, and then click Project/Solution.
8. In Object Explorer, expand SQL Server Agent, expand Jobs, right-click Record Execution Identity,
and then click Start Job at Step.
9. In the Start Jobs - MIA-SQL dialog box, make sure that the job ran successfully, and then click Close.
The job triggers the dbo.RecordIdentity stored procedure in the AdminDB database. The procedure
logs the identity of whoever ran the procedure to dbo.IdentityLog.
10. In Object Explorer, right-click Record Execution Identity, and then click View History.
11. In the Log File Viewer - MIA-SQL window, expand the visible job execution by clicking the plus sign
on the row in the right pane, and then scroll the window to the right so that the Message column is
visible.
Notice that the first row shows that the job was invoked by ADVENTUREWORKS\Student, but the
job step row shows that it was executed as ADVENTUREWORKS\ServiceAcct. When a sysadmin
user owns a job, the job steps are executed in the context of the SQL Server Agent service account by
default.
13. In Object Explorer, right-click Record Execution Identity, and then click Properties.
14. In the Job Properties - Record Execution Identity window, on the General page, clear the Owner
box, type ITSupportLogin, and then click OK.
15. In Object Explorer, right-click Record Execution Identity, and then click Start Job at Step.
16. In the ITSupportLogin dialog box, notice that the job fails, and then click Close.
17. To view the job history to find the reason for the failure, right-click Record Execution Identity, and
then click View History.
18. In the Log File Viewer - MIA-SQL window, expand the failed job by clicking the plus sign next to the
error symbol, scroll the window to the right until the Message column is visible, and then note the
reason for the failure. The failure reason should show as follows:
Executed as user: ITSupportLogin. The EXECUTE permission was denied on the object
'RecordIdentity', database 'AdminDB', schema 'dbo'. [SQLSTATE 42000] (Error 229).
The step failed.
The job was run as the ITSupportLogin login, which maps to the ITSupport user in the AdminDB
database; that user has no permissions to execute the stored procedure, so the job step failed.
19. In the Demo 1 - security context.sql query pane, execute the query under the comment that begins
Task 2 to grant the permission that is necessary to enable the job to run.
20. In Object Explorer, right-click Record Execution Identity, and then click Start Job at Step.
21. In the Start Jobs - MIA-SQL dialog box, notice that the job succeeds, and then click Close.
22. In the Demo 1 - security context.sql query pane, execute the query under the comment that begins
Task 1 to view the contents of the AdminDB.dbo.IdentityLog table.
24. Keep SQL Server Management Studio open for the next demonstration.
Sequencing Activity
Put the following fixed roles in order from least privileged to most privileged by numbering each to
indicate the correct order.
Steps
SQLAgentUserRole
SQLAgentReaderRole
SQLAgentOperatorRole
sysadmin
Lesson 2
Configuring Credentials
For SQL Server Agent job steps to be able to access resources outside SQL Server, the job steps must be
executed in the security context of a Windows identity that has permission to access the required
resources. Windows identities are separate from SQL Server identities, even though SQL Server can utilize
Windows logins and groups. For a job step to be able to use a separate Windows identity, the job step
must be configured to log on as that identity. To be able to log on, the Windows user name and password
need to be stored. Credentials are SQL Server objects that are used to store Windows user names and
passwords.
Note: Credential objects can have other uses outside security in SQL Server Agent; for
example, allowing logins that use SQL Server Authentication to impersonate a domain account.
For more information about credentials, see the topic Credentials (Database Engine) in Microsoft Docs:
Lesson Objectives
After completing this lesson, you will be able to:
Describe credentials
Configure credentials
Manage credentials
Overview of Credentials
A credential is a SQL Server object that contains
the authentication information that is required to
connect to a resource outside SQL Server. Most
credentials have a Windows user name and
password, although they might contain
authentication information for a third-party
cryptographic provider.
Credentials
A SQL Server Agent proxy account that can be
used for job execution maps to a credential in SQL
Server. In the next lesson, you will see how to map
a proxy account to a credential.
SQL Server automatically creates some system credentials that are associated with specific endpoints.
These automatically created system credentials have names that are prefixed with two hash signs (##).
Proxy accounts in SQL Server Agent cannot use system credentials; proxy accounts require server-level
credentials that users define.
Configuring Credentials
Credentials can be created by using the Transact-
SQL CREATE CREDENTIAL statement, or by using
SQL Server Management Studio.
Configuring Credentials
The password for a credential is called a secret and
is strongly encrypted and stored in the master
database. When SQL Server first needs to perform
any type of encryption, the SQL Server Agent
service generates a service master encryption key.
The service master key is also used to protect the
master keys for each database. (Not all databases
have master keys.)
Often an organization will have a policy that requires encryption keys to be replaced on a regular basis. If
the service master key is regenerated, the secrets that are stored for credentials are automatically
decrypted and re-encrypted by using the new service master key.
Note: Encryption in SQL Server is an advanced topic that is outside the scope of this course.
Note also that the encryption of secrets for credentials by using an Extensible Key Management
(EKM) provider is supported, but is also beyond the scope of this course.
The following example creates the FileOperation credential for the Windows account
ADVENTUREWORKS\FileSystemServices with the password Pa55w.rd:
CREATE CREDENTIAL
CREATE CREDENTIAL FileOperation
WITH IDENTITY = 'ADVENTUREWORKS\FileSystemServices',
SECRET = 'Pa55w.rd';
GO
For more information about creating credentials, see the topic Create a Credential in Microsoft Docs:
Create a Credential
http://aka.ms/H7v4s5
For more information about working with encryption keys in SQL Server, see the topic SQL Server and
Database Encryption Keys (Database Engine) in Microsoft Docs:
Managing Credentials
The sys.credentials system view provides catalog
information about existing credentials. For more
information about sys.credentials, see the topic
sys.credentials (Transact-SQL) in Microsoft Docs:
sys.credentials (Transact-SQL)
http://aka.ms/Cr8uot
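A quick way to review the credentials defined on an instance is to query the view directly:

```sql
-- System credentials have names prefixed with ##.
SELECT credential_id, name, credential_identity, create_date
FROM sys.credentials;
```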
Modifying Credentials
The password for a Windows account could
change over time. You can update a credential
with new values by using the ALTER CREDENTIAL
statement. In the example on the slide, notice how the FileOperation credential is being updated. Both
the user name and password (that is, the secret) are supplied in the ALTER CREDENTIAL statement.
Although the secret is optional, the ALTER CREDENTIAL command always updates both the identity and
the secret. If the secret is not supplied to an ALTER CREDENTIAL statement, the secret is set to NULL.
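The slide example is not reproduced in this text, but based on the description it would look something like the following; the new password is illustrative:

```sql
-- Both the identity and the secret are updated; omitting SECRET
-- would set the stored secret to NULL.
ALTER CREDENTIAL FileOperation
WITH IDENTITY = 'ADVENTUREWORKS\FileSystemServices',
     SECRET = 'N3wPa55w.rd';
GO
```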
For more information about managing credentials by using ALTER CREDENTIAL, see the topic ALTER
CREDENTIAL (Transact-SQL) in Microsoft Docs:
ALTER CREDENTIAL (Transact-SQL)
http://aka.ms/Ci5lfx
Demonstration Steps
1. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, under SQL Server Agent,
right-click Jobs, and then click New Job.
2. In the New Job window, on the General page, in the Name box, type Copy Export File.
4. In the New Job Step window, on the General page, in the Step name box, type Copy File.
copy d:\demofiles\Mod09\ExportData\export.txt d:\demofiles\Mod09\ImportData\import.txt /Y
Notice that the job is configured to run in the security context of the SQL Server Agent service
account.
7. On the Advanced page, in the On success action box, click Quit the job reporting success, and
then click OK.
9. In Object Explorer, under Jobs, right-click Copy Export File, and then click Start Job at Step.
10. In the Start Jobs - MIA-SQL dialog box, notice that the job fails, and then click Close.
11. To view the job history to find the reason for the failure, right-click Copy Export File, and then click
View History.
12. In the Log File Viewer - MIA-SQL window, expand the failed job by clicking the plus sign next to the
error symbol, scroll the window to the right until the Message column is visible, and then note the
reason for the failure. The failure reason should show as follows:
Close the Log File Viewer - MIA-SQL window. The job step failed because the service account does
not have permission to access the source and target folders in the file system. (One solution would be
to grant access to the folders to the service account, but instead you will create a credential, and then
link it to a proxy account in the next demonstration.)
15. Execute the code under the comment that begins Task 2 to examine the contents of the
sys.credentials catalog view. One row should be returned for the credential that you have just
created.
17. Leave SQL Server Management Studio open for the next demonstration.
Lesson 3
Configuring Proxy Accounts
In the last lesson, you saw that credentials are used in SQL Server to store identities that are external to
SQL Server—typically Windows user names and passwords. To enable a job step in a SQL Server Agent job
to use a credential, the job step is configured to use the security context of the credential through a proxy
account.
Lesson Objectives
After completing this lesson, you will be able to:
ActiveX® Script
Windows PowerShell®
Replication Distributor
Replication Merge
Replication Snapshot
Note: The ActiveX Script subsystem is marked for deprecation in a future version of SQL
Server, and should not be used for new development.
A job step that uses the proxy account can access the specified subsystems by using the security context
of the Windows user. Before SQL Server Agent runs a job step that uses a proxy account, SQL Server
Agent impersonates the credentials that are defined in the proxy account, and then runs the job step by
using that security context.
Note: The Windows user who is specified in the credential must have the Log on as a
batch job permission on the computer on which SQL Server is running.
Note that a user must have permission to use a proxy account before they can specify the proxy account
in a job step. By default, only members of the sysadmin fixed server role have permission to access all
proxy accounts. Permission to use individual proxy accounts can be granted to members of the SQL Server
Agent fixed roles—SQLAgentUserRole, SQLAgentReaderRole, and SQLAgentOperatorRole.
Note: As with other permissions in SQL Server Agent, access to proxy accounts is inherited
from lower-privileged SQL Server Agent fixed roles to higher-privileged SQL Server Agent fixed
roles. For example, if SQLAgentUserRole is granted permission to use the FileOpProxy proxy
account, members of both SQLAgentReaderRole and SQLAgentOperatorRole will also have
permission to use the proxy account.
Proxy accounts can be configured by using SQL Server Management Studio, or through system stored
procedures in the msdb system database.
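As a minimal sketch using the msdb stored procedures (the proxy and credential names match the demonstration that follows; the login name is a placeholder):

```sql
-- Create a proxy account linked to an existing credential.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name = N'FileOp',
    @credential_name = N'FileOperation',
    @enabled = 1;

-- Grant the proxy access to the CmdExec subsystem.
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'FileOp',
    @subsystem_id = 3;  -- 3 = CmdExec

-- Allow a login to use the proxy in job steps.
EXEC msdb.dbo.sp_grant_login_to_proxy
    @login_name = N'ADVENTUREWORKS\Student',
    @proxy_name = N'FileOp';
```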
For more information about creating proxy accounts, see the topic Create a SQL Server Agent Proxy in
Microsoft Docs:
For more information about modifying proxy accounts, see the topic Modify a SQL Server Agent Proxy in
Microsoft Docs:
Demonstration Steps
1. In SQL Server Management Studio, in Solution Explorer, double-click the Demo 3 - proxy.sql script
file.
2. Execute the code under the comment that begins Task 1 to create a new proxy account that is linked
to the FileOperation credential that you created in the last demonstration.
3. Execute the code under the comment that begins Task 2 to examine the dbo.sysproxies system
table.
4. Execute the code under the comment that begins Task 3 to examine the contents of the
dbo.syssubsystems system table. Note that the CmdExec subsystem has a subsystem_id of 3.
5. Execute the code under the comment that begins Task 4 to associate the FileOp proxy account with
the CmdExec subsystem (@subsystem_id = 3).
6. In Object Explorer, under Jobs, right-click Copy Export File, and then click Properties.
7. In the Job Properties - Copy Export File window, on the Steps page, click Edit.
8. In the Job Step Properties - Copy File window, in the Type list, click Operating system (CmdExec).
10. In the Job Properties - Copy Export File window, click OK.
12. In SQL Server Management Studio, in Object Explorer, under Jobs, right-click Copy Export File, and
then click Start Job at Step.
13. In the Start Jobs - MIA-SQL dialog box, notice that the job succeeds, and then click Close.
14. In File Explorer, demonstrate that the folder now contains a copy of the file from the ExportData
folder, and then close File Explorer.
15. Close SQL Server Management Studio without saving any changes.
Question: Why are credentials stored in the master system database and proxy accounts
stored in the msdb system database?
Objectives
After completing this lab, you will be able to:
Configure credentials.
Configure proxy accounts.
Test the security context of SQL Server Agent jobs and job steps.
3. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
3. Based on the information in the job history, what is the cause of the job failure?
Results: After completing this exercise, you should have identified the cause of the job failure.
Before you can create a proxy account, you must configure a credential.
1. Create a Credential
Hint: Credentials are configured under the server-level Security node in Object Explorer.
Results: After completing this exercise, you should have created a credential that references the
ADVENTUREWORKS\Student Windows account.
2. Configure ExtractProxy to have access to the SQL Server Integration Services Package job step
subsystem.
3. Grant permission to use ExtractProxy to the owner of the SQL Server Agent job called Generate
Sales Log.
Results: After completing this exercise, you should have created a proxy account that is suitable for
correcting the problem with the SQL Server Agent job called Generate Sales Log.
Results: After completing this exercise, the Generate Sales Log SQL Server Agent job should be working
correctly, and the sales_log.csv file should be generated to D:\Labfiles\Lab09\Starter\SalesLog each
time the job runs.
ADVENTUREWORKS\Administrator
PromoteApp
ADVENTUREWORKS\Student
Best Practice:
1. Use a Windows domain user as the SQL Server Agent service account.
2. Create proxy accounts that are assigned the minimum permissions required for job execution.
Review Question(s)
Question: As a general rule, why should proxy accounts not be assigned access to all of the
job step subsystems?
Module 10
Monitoring SQL Server with Alerts and Notifications
Contents:
Module Overview 10-1
Lesson 1: Monitoring SQL Server Errors 10-2
Module Overview
One key aspect of managing Microsoft SQL Server® in a proactive manner is to make sure you are aware
of problems and events that occur in the server, as they happen. SQL Server logs a wealth of information
about issues. You can configure it to advise you automatically when these issues occur, by using alerts and
notifications. The most common way that SQL Server database administrators receive details of events of
interest is by email message. This module covers the configuration of Database Mail, alerts, and
notifications for a SQL Server instance, and the configuration of alerts for Microsoft Azure SQL Database.
Objectives
At the end of this module, you will be able to:
Lesson 1
Monitoring SQL Server Errors
It is important to understand the core aspects of errors as they apply to SQL Server. In particular, you need
to consider the nature and locations of errors, in addition to the data that they return. SQL Server records
severe errors in the SQL Server error log, so it is also important to know how to configure the log.
Lesson Objectives
After completing this lesson, you will be able to:
What Is an Error?
It might not be immediately obvious that a SQL
Server Database Engine error (or exception) is
itself an object, and therefore has properties that
you can access.
Property Description
Error numbers are helpful when trying to locate information about the specific error, particularly when
searching for information online.
The following example shows how to use the sys.messages catalog view to retrieve a list of system
supplied error messages, showing the error number (message_id), severity (severity), and error message
(text) properties described in the table above in English:
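The query itself is not reproduced in this copy; it takes a form such as this (1033 is the language ID for US English):

```sql
-- List system-supplied error messages in US English.
SELECT message_id, severity, text
FROM sys.messages
WHERE language_id = 1033
ORDER BY message_id;
```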
Error messages can be localized and are returned in a range of languages, so the WHERE clause of this
example limits the results to view only the English version.
Note: The State property is not defined in sys.messages. A given error message will
always have the same severity, but might occur with different states.
Error numbers less than 50,000 are reserved for system error messages. You can define custom error
messages that are specific to your applications, using error numbers greater than 50,000.
For more information about sys.messages and Database Engine errors, see the topic Understanding
Database Engine Errors in the SQL Server Technical Documentation:
Severities 0 to 10
Severity values from 0 to 10 are informational
messages raised by the Database Engine to
provide information associated with the running
of a query.
Information Messages
For example, the following query raises an informational message (a warning that NULL values are
eliminated by the aggregate) when any row has a NULL value in the Color column:
SELECT COUNT(Color) FROM Production.Product;
Messages with a severity of 10 or lower are treated as information, and are ignored by the Transact-SQL
TRY…CATCH construct. An error raised with severity 10 is converted to severity 0 before being returned to
a client application.
Severities 11 to 16
Severity values from 11 to 16 are used for errors that the user can correct. Typically, SQL Server uses them
when it asserts that the code being executed contains an error, or has experienced an error during
execution. Errors in this range include:
Severities 17 to 19
Severity values 17 to 19 are serious software errors that the user cannot correct, but which can normally
be addressed by an administrator. For example, severity 17 indicates that SQL Server has run out of
resources (such as memory or disk space). When an error with severity in the range 17 to 19 is raised, the
Transact-SQL batch where the error occurred will typically abort, but the database connection is not
affected. Errors with severity 19 are written to the SQL Server error log.
Severities 20 to 24
Severity values of 20 and above indicate very serious errors that involve either the hardware or SQL Server
itself, and that threaten the integrity of one or more databases or of the SQL Server service. Errors in the
severity range 20 to 24 are written to the SQL Server error log. The connection on which an error of
severity 20 or greater occurs will, in almost all circumstances, be automatically closed.
For more information about Database Engine error severity levels, see the topic Database Engine Error
Severities in the SQL Server Technical Documentation:
By default, SQL Server retains the previous six error logs, and gives the most recent archived log the
extension .1, the second most recent the extension .2, and so on. The current error log has no extension.
You can increase the number of log files to retain by customizing the log configuration, but you cannot
choose to retain fewer than six log files.
The log file cycles with every restart of the SQL Server instance. Occasionally, you might want to remove
excessively large log files. You can use the sp_cycle_errorlog system stored procedure to close the
existing log file and open a new one on demand. If there is a regular need to recycle the log file, you
could create a SQL Server Agent job to execute the system stored procedure on a schedule. Cycling the
log can help you to stop the current error log from becoming too large.
For more information about sp_cycle_errorlog, see the topic sp_cycle_errorlog (Transact-SQL) in the SQL
Server Technical Documentation:
sp_cycle_errorlog (Transact-SQL)
https://aka.ms/Lz9o3t
For more information about increasing the number of log files retained before recycling through SSMS,
see the topic Configure SQL Server Error Logs in the SQL Server Technical Documentation:
Demonstration Steps
View the SQL Server Error Log
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa55.wrd.
3. Start SQL Server Management Studio and connect to the MIA-SQL Database Engine instance using
Windows authentication.
4. In Object Explorer, under MIA-SQL, expand Management, and then expand SQL Server Logs.
6. In the Log File Viewer - MIA-SQL window, view the log entries. Note that when you select a log
entry, its details are shown in the lower pane.
7. In the Select logs pane, expand SQL Server Agent, and select Current.
8. Scroll the main log entries pane to the right until you can see the Log Type column, and then scroll
down to find an entry with the log type SQL Server Agent.
9. When you have finished viewing the log entries, click Close.
12. If the User Account Control dialog box appears, click Yes.
13. Note that the current SQL Server log is stored in the file named ERRORLOG, and the current SQL
Server Agent log is stored as SQLAGENT.OUT. The remaining log files contain log entries for other SQL
Server components and services.
2. In the query window, type the following Transact-SQL code, and then click Execute:
EXEC sys.sp_cycle_errorlog;
3. In Object Explorer, under SQL Server Logs, right-click Current, and then click View SQL Server Log.
4. In the Log File Viewer - MIA-SQL window, note that the log has been reinitialized, and then click
Close.
5. Close the query window without saving changes, but leave SSMS open for the next demonstration.
0 to 10
11 to 16
17 to 19
20 to 24
Lesson 2
Configuring Database Mail
SQL Server can be configured to advise administrators when issues arise that require their attention, such
as the failure of a scheduled job or a significant error. Email is the most commonly used mechanism for
notifications from SQL Server. You can use the Database Mail feature of SQL Server to connect to an
existing Simple Mail Transport Protocol (SMTP) server when SQL Server needs to send email.
Database Mail is not used solely for sending alerts and notifications to administrators; when Database Mail
is configured, you can send email from Transact-SQL code. To restrict this capability, you can control
which users can use the email features of the product, and you can configure different email profiles for use
by different security principals. Because it can be important to track and trace sent email, SQL
Server enables you to configure a retention policy for sent messages.
Note: Database Mail is not available in Azure SQL Database—this uses a different alerting
mechanism, discussed later in this module.
Lesson Objectives
After completing this lesson, you will be able to:
Describe Database Mail.
By default, the Database Mail stored procedures are disabled to reduce the surface area of SQL Server;
when they are enabled, only users who are members of the sysadmin server role or the
DatabaseMailUserRole database role in the msdb database can execute them. Database Mail logs email
activity, and also stores copies of all messages and attachments in the msdb database.
You use the Database Mail Configuration Wizard to enable Database Mail and to configure accounts and
profiles. A Database Mail account contains all the information that SQL Server needs to send an email
message to the mail server. You must specify what type of authentication to use (Windows, basic, or
anonymous), the email address, the email server name, type, and port number—and if your SMTP server
requires authentication, a username and password. You can also configure Database Mail using the group
of system stored procedures with the naming pattern msdb.dbo.sysmail….
SQL Server stores the configuration details in the msdb database, along with all SQL Server Agent
configuration data. SQL Server Agent also caches the profile information in memory so you can send
email if the SQL Server Database Engine is no longer available.
For a further overview of Database Mail, see the topic Database Mail in the SQL Server Technical
Documentation:
Database Mail
https://aka.ms/Gjok0m
For more information about the process that sends Database Mail, see the topic Database Mail External
Program in the SQL Server Technical Documentation:
For a complete list of the stored procedures available to configure and work with Database Mail, see the
topic Database Mail Stored Procedures (Transact-SQL) in the SQL Server Technical Documentation:
Mail Profiles
You can create multiple configurations by using different profiles. For example, you could create one
profile to send mail to an internal SMTP server—using an internal email address—for mails sent by SQL
Server Agent, and a second profile for a database application to send external email notifications to
customers or suppliers.
Each database user can access multiple profiles. If you do not specify a profile when sending an email
message, Database Mail uses the default profile. If both private and public profiles exist, precedence is
given to a private default profile over a public one. If you do not specify a default profile, or if a
nondefault profile should be used, you must specify the profile name you want to use as a parameter
when sending mail.
The following example shows how to use the sp_send_dbmail system stored procedure to send an email
message using a specific profile, and using the optional @profile_name parameter:
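The example is not reproduced in this copy; the call takes a form such as the following. The profile name and recipient match those used in the demonstrations in this module; the subject and body values are placeholders:

```sql
-- Send an email message through a specific Database Mail profile.
EXEC msdb.dbo.sp_send_dbmail
    @profile_name = N'SQL Server Agent Profile',
    @recipients = N'[email protected]',
    @subject = N'Test message',
    @body = N'This message was sent by Database Mail.';
```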
For more information about profiles in Database Mail, see the topic Database Mail Configuration Objects
in the SQL Server Technical Documentation:
The msdb.dbo.sp_send_dbmail stored procedure has a lot of powerful features, including the ability to
send email messages in HTML or plain text format, and the ability to attach files to email messages. For
more information about the capabilities of sp_send_dbmail, see the topic sp_send_dbmail (Transact-SQL)
in the SQL Server Technical Documentation:
sp_send_dbmail (Transact-SQL)
https://aka.ms/H9ejv2
You can limit the types and size of attachments that users can send in email messages through Database Mail.
You can configure this limitation by using the Database Mail Configuration Wizard or by calling the
dbo.sysmail_configure_sp system stored procedure in the msdb database.
For more information about dbo.sysmail_configure_sp, see the topic sysmail_configure_sp (Transact-
SQL) in the SQL Server Technical Documentation:
sysmail_configure_sp (Transact-SQL)
https://aka.ms/neysmz
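For example, a call along these lines restricts the maximum attachment size. The MaxFileSize parameter is expressed in bytes; the 2 MB value shown is illustrative:

```sql
-- Limit Database Mail attachments to approximately 2 MB (value in bytes).
EXEC msdb.dbo.sysmail_configure_sp
    @parameter_name = N'MaxFileSize',
    @parameter_value = '2097152';
```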
You configure the logging level parameter by using the Configure System Parameters page of the
Database Mail Configuration Wizard or by calling the dbo.sysmail_configure_sp stored procedure in the
msdb database. You can view the logged messages by querying the dbo.sysmail_event_log table.
Internal tables in the msdb database also hold copies of the email messages and attachments that
Database Mail sends, together with the current status of each message. Database Mail updates these
tables when it processes messages. You can track the delivery status of an individual message by viewing
information in the following system views in msdb:
Because Database Mail retains the outgoing messages and their attachments, you need to plan a
retention policy for this data. If the volume of Database Mail messages and related attachments is high,
plan for substantial growth of the msdb database.
To restrict the growth of msdb due to retained Database Mail messages, you can periodically delete
messages to regain space and to comply with your organization's document retention policies.
The following example shows how to delete messages, attachments, and log entries that are more than
one month old:
DECLARE @CutoffDate datetime = DATEADD(MONTH, -1, GETDATE());

EXECUTE dbo.sysmail_delete_mailitems_sp
    @sent_before = @CutoffDate;

EXECUTE dbo.sysmail_delete_log_sp
    @logged_before = @CutoffDate;
You could schedule these commands to be executed periodically by creating a SQL Server Agent job.
For more information about the views and stored procedures available to work with Database Mail
messaging objects, see the topic Database Mail Messaging Objects in the SQL Server Technical
Documentation:
Demonstration Steps
Create a Database Mail Profile
1. In SSMS, in Object Explorer, under MIA-SQL, under Management, right-click Database Mail, and
then click Configure Database Mail.
4. On the New Profile page, in the Profile name box, type SQL Server Agent Profile, and then click
Add.
5. In the Add Account to Profile 'SQL Server Agent Profile' dialog box, click New Account.
6. In the New Database Mail Account dialog box, enter the following details, and then click OK:
8. On the Manage Profile Security page, select Public for the SQL Server Agent Profile profile, and
set its Default Profile setting to Yes, and then click Next.
10. On the Complete the Wizard page, click Finish, and when configuration is complete, click Close.
1. In Object Explorer, right-click Database Mail, and then click Send Test E-Mail.
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile
database mail profile is selected.
3. In the To box, type [email protected], and then click Send Test E-mail.
4. In File Explorer, navigate to the C:\inetpub\mailroot\Drop folder, and verify that an email message
has been created.
5. Double-click the message to view it in Outlook. When you have read the message, close it and
minimize the Drop folder window.
6. In the Database Mail Test E-Mail dialog box (which might be behind SQL Server Management
Studio), click OK.
1. In SSMS, on the File menu, point to Open, and then click Project/Solution.
2. In the Open Project dialog box, navigate to D:\Demofiles\Mod10\Demo, click Demo.ssmssln, and
then click Open.
3. In Solution Explorer, expand Queries, double-click Demo 2 - database mail.sql, review the code,
and then click Execute.
4. View the results. The first result set shows system events for Database Mail, and the second result set
shows records of email messages that have been sent.
5. Keep the solution and SQL Server Management Studio open for the next demonstration.
Question: You are troubleshooting Database Mail. You want to see a list of the email
messages that have been successfully sent and a list of email messages that could not be
sent. Where can you find this information?
Lesson 3
Operators, Alerts, and Notifications
Many SQL Server systems have multiple administrators. You can use SQL Server Agent to configure
operators that are associated with one or more administrators and to determine when to contact each of
the operators—along with the method to use for that contact.
SQL Server can also detect many situations that might be of interest to administrators. You can configure
alerts that are based on SQL Server errors or on system events—such as low disk space—and then
configure SQL Server to notify you of these situations.
Lesson Objectives
After completing this lesson, you will be able to:
Configuring Operators
You can define new operators using either SSMS
or the dbo.sp_add_operator system stored procedure. After you define an operator, you can view the
definition by querying the dbo.sysoperators system table in the msdb database.
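For example, an operator can be created as follows. The operator name and email address match those used in the demonstrations in this module:

```sql
-- Create an operator that can receive email notifications.
EXEC msdb.dbo.sp_add_operator
    @name = N'Student',
    @enabled = 1,
    @email_address = N'[email protected]';
```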
You can configure three types of contact methods for each operator:
Email. An SMTP email address where notifications are sent. Where possible, you should use group
email addresses rather than individual ones. You can also list multiple email addresses by separating
them with a semicolon.
Pager email. An SMTP email address where a message can be sent at specified times (and days)
during a week.
Note: Pager and Net send notifications are marked for deprecation, and should not be
used for new development because they will be removed in a future version of SQL Server.
Fail-Safe Operator
You can also define a fail-safe operator that is notified in the following circumstances:
The SQL Server Agent cannot access the tables that contain settings for operators and notifications in
the msdb database.
A pager notification must be sent at a time when no operators configured to receive pager alerts are
on duty.
Job Notifications
You can configure SQL Server Agent jobs to send messages to an operator on completion, failure, or
success of a job. Configuring jobs to send notifications on completion or success might lead to a large
volume of email notifications, so DBAs might prefer to be advised only if a job fails. However, for
business-critical jobs, you might want to be notified, regardless of the outcome, to remove any doubt
over the notification system itself.
For more information about creating operators, see the topic Create an Operator in the SQL Server
Technical Documentation:
Create an Operator
https://aka.ms/A7tfri
For more information about configuring notifications for the status of SQL Server Agent jobs, see the
topic Notify an Operator of Job Status in the SQL Server Technical Documentation:
Create an operator.
Configure a job to notify an operator.
Demonstration Steps
Enable a Mail Profile for SQL Server Agent
1. In Object Explorer, under MIA-SQL, right-click SQL Server Agent, and then click Properties.
2. In the SQL Server Agent Properties dialog box, on the Alert System page, select Enable mail
profile.
3. In the Mail profile drop-down list, select SQL Server Agent Profile, and then click OK.
4. In Object Explorer, right-click SQL Server Agent, and then click Restart.
6. In the Microsoft SQL Server Management Studio dialog box, click Yes.
Create an Operator
2. Select the code under the comment that begins Task 2, and then click Execute to create a new
operator called Student.
Configure a Job to Notify an Operator
1. In Object Explorer, expand SQL Server Agent, expand Jobs and view the existing jobs.
3. In the Job Properties - Back Up Database - AdventureWorks dialog box, on the Notifications
page, select E-mail.
5. In the second drop-down list, click When the job completes, and then click OK.
6. In Object Explorer, expand Operators, right-click Student, and then click Properties.
7. In the Student Properties dialog box, on the Notifications page, click Jobs, note the job
notifications that have been defined for this operator, and then click Cancel.
8. Under Jobs, right-click Back Up Database - AdventureWorks, and click Start Job at Step.
11. In the Student Properties dialog box, on the History page, note the most recent notification by
email attempt, and then click Cancel.
12. In File Explorer, in the C:\inetpub\mailroot\Drop folder, verify that a new email message has been
created.
13. Double-click the most recent file to view it in Outlook. Then, when you have read the message, close
it, and minimize the Drop window.
14. Keep the solution and SQL Server Management Studio open for the next demonstration.
SQL Server writes errors and events to the Windows Application Log, which notifies SQL Server
Agent when events of interest occur. This callback mechanism operates efficiently because SQL Server
Agent does not need to continuously read (or poll) the Application Log to find events of interest.
When the Application Log notifies SQL Server Agent of a logged event, SQL Server Agent compares the
event to the alerts that you have defined. When SQL Server Agent finds a match, it fires the alert, which is
an automated response to an event.
Note: You must configure SQL Server Agent to write messages to the Windows Application
Event Log if they are to be used for SQL Server Agent alerts.
Alert Actions
You can create alerts to respond to individual error numbers or to all errors of a specific severity level. You
can define the alert for all databases or for a specific database. You can also define the time delay
between responses.
Best Practice: It is considered good practice to configure notifications for all error
messages with severity level 19 and above.
System Events
In addition to monitoring SQL Server events, SQL Server Agent can also check conditions that are
detected by Windows Management Instrumentation (WMI) events. The WMI Query Language (WQL)
queries that retrieve the performance data execute several times each minute, so it can take a few seconds
for these alerts to fire. You can also configure performance condition alerts on any of the performance
counters that SQL Server exposes.
For more information about configuring alerts, see the topic Alerts in the SQL Server Technical
Documentation:
Alerts
https://aka.ms/Wixjse
Creating Alerts
You can create alerts by using SSMS or by calling
the dbo.sp_add_alert system stored procedure.
When defining an alert, you can also specify a SQL
Server Agent job to start when the alert occurs.
Using sp_add_alert
EXEC msdb.dbo.sp_add_alert
    @name = N'AdventureWorks Transaction Log Full',
    @message_id = 9002,
    @delay_between_responses = 0,
    @database_name = N'AdventureWorks';
GO
Logged Events
You have seen that alerts will only fire for SQL Server errors if the error messages are written to the
Windows Application Event Log. In general, errors with severity levels from 19 to 25 are automatically written to
the Application Log, but this is not always the case. To check which messages are automatically written to
the log, you can query the is_event_logged column in the sys.messages catalog view.
Most events with severity levels less than 19 will only trigger alerts if you perform one of the following
steps:
Modify the error message by using the dbo.sp_altermessage system stored procedure to make it a
logged message.
Raise the error in code by using the RAISERROR WITH LOG option.
Use the xp_logevent system extended stored procedure to force entries to be written to the log.
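For example, you can check whether a message is logged automatically, and then use sp_altermessage to make it a logged message. Error 9002 (transaction log full) is used here because it appears elsewhere in this module; 1033 is the language ID for US English:

```sql
-- Check whether error 9002 is written to the Application Log automatically.
SELECT message_id, severity, is_event_logged
FROM sys.messages
WHERE message_id = 9002 AND language_id = 1033;

-- Mark the message as a logged message so that it can trigger alerts.
EXEC sp_altermessage
    @message_id = 9002,
    @parameter = 'WRITE_TO_LOG',
    @parameter_value = 'true';
```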
For more information about creating alerts with sp_add_alert, see the topic sp_add_alert (Transact-SQL) in
the SQL Server Technical Documentation:
sp_add_alert (Transact-SQL)
https://aka.ms/Rb92a9
Execute a Job
You can configure a SQL Server Agent job to
execute in response to an alert. If you need to start
multiple jobs, you must create a new one that
starts each of your multiple jobs in turn, and then
configure an alert response to run the new job.
Notify Operators
You can define a list of operators to notify in response to an alert by running the
dbo.sp_add_notification system stored procedure. When sending messages to operators about alerts, it
is important to provide the operator with sufficient context so that they can determine the appropriate
action to take.
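For example, a notification can be attached to an alert as follows. The alert and operator names match those created in the demonstrations in this module:

```sql
-- Notify the Student operator by email when Log Full Alert fires.
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Log Full Alert',
    @operator_name = N'Student',
    @notification_method = 1;  -- 1 = email
```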
You can include tokens in the message to add detail. There are special tokens available for working with alerts, including:
A-DBN – the name of the database in which the error occurred.
A-SVR – the name of the server on which the alert fired.
A-ERR – the error number.
A-SEV – the error severity.
A-MSG – the message text.
By default, the inclusion of tokens is disabled for security reasons, but you can enable it in the properties of SQL Server Agent.
For more information about using tokens in SQL Server Agent notifications, see the topic Use Tokens in
Job Steps in the SQL Server Technical Documentation:
3. Check that the error message is written to the Application Log. For SQL Server event alerts, the alert can fire only if the error message is written to the Application Log. You should also make sure that the Application Log is configured with sufficient size to hold all the event log details.
4. Ensure that the alert is enabled. If the alert is disabled, it will not fire.
5. Check that the alert was raised. If the alert does not appear to be raised, make sure that the setting
for the delay between responses is not set to too high a value.
6. Check if the alert was raised, but no action was taken. Check that the job that is configured to respond to the alert functions as expected. For operator notifications, check that Database Mail is working and that the SMTP server configuration is correct. Test the Database Mail profile that is sending the notifications by manually sending mail from the profile used by SQL Server Agent.
Create an alert.
Test an alert.
Demonstration Steps
Create an Alert
1. In Object Explorer, under SQL Server Agent, right-click Alerts, and then click New Alert.
2. In the New Alert dialog box, on the General page, in the Name box, type Log Full Alert.
3. In the Type drop-down list, note that you can configure alerts on WMI events, performance monitor
conditions, and SQL Server events, and then click SQL Server event alert.
4. Click Error number, and in the box, type 9002 (which is the error number raised by SQL Server when
a database transaction log is full).
5. On the Response page, select Notify operators, and then select the E-mail check box for the
Student operator.
6. On the Options page, under Include alert error text in, select E-mail, and then click OK.
Test an Alert
1. In Solution Explorer, double-click Demo 4 - alerts.sql, and then click Execute. Wait while the script
fills a table in the TestAlertDB database. When the log file for that database is full, error 9002 occurs.
2. In Object Explorer, expand Alerts, right-click Log Full Alert, and then click Properties.
3. In the ‘Log Full Alert’ alert properties dialog box, on the History page, note the Date of last alert
and Date of last response values, and then click Cancel.
4. In File Explorer, in the C:\inetpub\mailroot\Drop folder, verify that a new email message has been
created.
5. Double-click the most recent message to view it in Outlook. Then, when you have read the message,
close it, and close the Drop window.
6. Leave the solution and SSMS open for the next demonstration.
Lesson 4
Alerts in Azure SQL Database
The alerting system used in Azure SQL Database is different from the operators, alerts, and notifications
system used by on-premises SQL Server instances. This lesson covers how to configure and work with
alerts in Azure SQL Database.
Lesson Objectives
At the end of this lesson, you will be able to:
Describe the alerting system in Azure SQL Database, and how it differs from a SQL Server instance.
Failed connections
Successful connections
CPU percentage
Deadlocks
DTU percentage
DTU limit
DTU used
Log IO percentage
Data IO percentage
Sessions percentage
Workers percentage
Note: Metrics can be added to or removed from this list when Azure SQL Database features
are added, removed, or changed. There is no facility to use Database Engine error messages as
the basis for an alert outside the list of predefined metrics.
After a metric is selected, you configure an alert threshold, a reporting interval, and a list of email addresses to notify. You can also deliver the alert to an HTTP or HTTPS endpoint by using Azure webhooks.
Note: Azure alerts are sent from within the Azure cloud. If network connectivity is not
available between your location and the Azure cloud—either because of a network outage or
because of missing firewall rules—you will not receive alerts from the Azure infrastructure.
Each alert is made up of two messages: one sent when the alert threshold is crossed, and a second sent when the alert resolves and the metric returns to a nonalert state.
When you create an alert rule, you specify:
An alert name.
The condition that triggers the alert—one of greater than, greater than or equal to, less than, or less
than or equal to.
The threshold value which triggers the alert. This is used in combination with the condition to
determine when the alert is triggered.
The alert period. The frequency at which the Azure alert system polls the metric value—preconfigured
intervals between five minutes and four hours. This setting determines the greatest frequency with
which you will receive the alert.
Whether to alert owners, contributors to, and readers of the database. By default, this will notify the
administrator of the Azure subscription under which the Azure SQL Database is running.
For more information about configuring Azure alerts through the Azure portal, see the topic Create alerts
in the Azure documentation:
Create alerts
https://aka.ms/Ixkogu
For more information about adding Azure alerts using Azure PowerShell, see the command Add-AzureRmMetricAlertRule in the Azure documentation (these alerts are not database specific):
Add-AzureRmMetricAlertRule
https://aka.ms/Mvgiio
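A hedged sketch of such a rule; the resource group, location, metric name, and the $databaseResourceId variable are illustrative assumptions, and parameter names can differ slightly between module versions:

```powershell
# $databaseResourceId is assumed to hold the Azure resource ID of the
# AdventureWorksLT database; all other names here are examples.
Add-AzureRmMetricAlertRule `
    -Name "AdventureWorksLT DTU usage alert" `
    -Location "East US" `
    -ResourceGroup "MyResourceGroup" `
    -TargetResourceId $databaseResourceId `
    -MetricName "dtu_consumption_percent" `
    -Operator GreaterThan `
    -Threshold 80 `
    -WindowSize 00:05:00 `
    -TimeAggregationOperator Average;
```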
For more information about using webhooks to deliver Azure alerts, see the topic Configure a webhook on
an Azure metric alert in the Azure documentation:
Demonstration Steps
1. Open Internet Explorer, and browse to https://portal.azure.com/.
2. Sign in to the Azure portal with your Azure pass or Microsoft account credentials.
3. In the menu, click SQL databases, and then on the SQL databases blade, click AdventureWorksLT.
6. On the Add an alert rule blade, notice that the Resource box is automatically populated with the
database name.
7. Configure the alert using the following values, and then click OK:
o Threshold: 1.
9. On the Query menu, point to Connection, and then click Change Connection.
10. In the Connect to Database Engine dialog box, connect to the Azure SQL Database server hosting
your copy of the AdventureWorksLT database. (You must use the credentials you configured when
you configured the Azure SQL Database server for this connection. If you are unsure of the server
name, it is shown on the AdventureWorksLT blade of the Azure portal.)
13. In the Azure portal, wait for the Alert rules blade to update so that the rule shows a LAST ACTIVE
value other than Never, indicating that the alert was triggered. This could take several minutes and
you might need to refresh the page.
14. On the Alert rules blade, click AdventureWorksLT DTU usage alert. Notice that a line chart shows
the alert metric over time, with the threshold marked as a dotted line.
16. If you are planning to show the content of an alert email message, log into your mailbox and examine
mail delivered from Microsoft Azure Alerts.
17. Close Internet Explorer, and then close SSMS without saving any changes.
DTU percentage
CPU percentage
Blocked by Firewall
Objectives
After completing this lab, you will be able to:
Configure operators.
Configure alerts and notifications.
Password: Pa55w.rd
The new profile should be public, and it should be the default Database Mail profile.
2. Verify that the test email message is successfully delivered to the C:\inetpub\mailroot\Drop folder.
3. Query the dbo.sysmail_event_log and dbo.sysmail_mailitems tables in the msdb database to view
Database Mail events and email history.
Results: After this exercise, you should have configured Database Mail with a new profile named SQL
Server Agent Profile.
Results: After this exercise, you should have created operators named Student and DBA Team, and
configured the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile.
1. Create an Alert
2. Configure the alert to run the Backup Log - InternetSales job and send an email message that
includes the error message to the Student operator if error number 9002 occurs in the InternetSales
database.
2. Configure the Back Up Database - InternetSales and Back Up Log - InternetSales jobs to notify the
Student operator on completion.
3. Verify the job notifications assigned to the Student operator by viewing its notification properties.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.
Also, you should have configured the Back Up Database - AWDataWarehouse, Back Up Database -
HumanResources, Back Up Database - InternetSales, and Back Up Log - InternetSales jobs to send
notifications.
2. View the history properties of the InternetSales Log Full Alert alert to verify the most recent alert
and response.
3. Verify that notification email messages for the full transaction log error and the completion of the
Back Up Log - InternetSales job were successfully delivered to the C:\inetpub\mailroot\Drop
folder.
2. View the history properties of the Student operator to verify the most recent notification that was
sent.
3. Verify that notification email messages for the failure of the Back Up Database - AWDataWarehouse
job and the completion of the Back Up Database - InternetSales job were successfully delivered to
the C:\inetpub\mailroot\Drop folder.
Results: After this exercise, you will have verified that the notifications you configured for backups of the
AWDataWarehouse, HumanResources, and InternetSales databases work as expected.
You will also verify that an alert is triggered when the transaction log of the InternetSales database is
full.
Question: Under what circumstances would email notifications be sent to the DBA Team
operator you created?
Best Practice: When using Database Mail, and planning notifications and alerts in SQL
Server, consider the following best practices:
Provide limited access to the ability to send email messages from the Database Engine.
Implement a retention policy for Database Mail log and mail auditing.
Review Question(s)
Question: You are planning to send notifications from SQL Server, and think it might be
easier to use NET SEND notifications instead of email. Why should you not do this?
Module 11
Introduction to Managing SQL Server Using PowerShell
Contents:
Module Overview 11-1
Lesson 1: Getting Started with Windows PowerShell 11-2
Module Overview
This module looks at how to use Windows PowerShell® with Microsoft SQL Server®. Businesses constantly need to increase the efficiency and reliability of maintaining their IT infrastructure; with PowerShell, you can improve both by creating scripts to carry out tasks. PowerShell
scripts can be tested and applied multiple times to multiple servers, saving your organization both time
and money.
Objectives
After completing this module, you will be able to:
Describe the benefits of PowerShell and its fundamental concepts.
Lesson 1
Getting Started with Windows PowerShell
Windows PowerShell includes a shell—or console—and a scripting language. You can use it with SQL
Server to automate common SQL Server tasks. In this lesson, you will learn about how PowerShell works
and how it supports working with SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
Explain what PowerShell is, and when you would use it.
With PowerShell, you can create scripts that work on one machine, 10 machines, or all the machines in
your data center. You can then pass the scripts to someone else to run, with the confidence that they will
obtain the same results. You cannot do that with a graphical user interface (GUI)—each time you perform
a task using the GUI, it takes about the same time; if you ask someone else to do the job, they may or may
not do it the same way. The GUI may be friendly, but it can be time consuming for frequently-run tasks.
With PowerShell, you can automate tasks using scripts, saving time and improving quality.
PowerShell cmdlets
Cmdlets take the form of a verb plus a noun; for example, Get-Help, Import-Module, or Get-Verb. Although PowerShell is case-insensitive, the code is more readable when it is capitalized appropriately. The noun often carries a prefix that identifies the technology; SQL Server cmdlets, supplied by the sqlps module, use the Sql prefix (as in Backup-SqlDatabase). All the SQL Server PowerShell cmdlets can be displayed by typing Get-Command -Module sqlps. If you want to display all the cmdlets, you type Get-Command.
PowerShell verbs are standardized to help you become familiar enough to guess commands for a given
situation.
Note: The noun in the verb-noun syntax is always singular. This makes commands easier to remember.
PowerShell Aliases
If the cmdlet format seems a little unfamiliar, there are some commands that anyone with command-line
or Unix experience will recognize. You can, for example, use cd to change directories, cls to clear the screen, or dir to list the contents of a directory. These are not actually PowerShell commands, but aliases. You
can also think of them as shortcuts to some common commands. You can display a list of aliases by
typing Get-Alias.
PowerShell Console
You can start PowerShell either from the taskbar or directly from Object Explorer in SQL Server
Management Studio. Whichever method you use, the PowerShell console opens.
Note: Many cmdlets will not run unless you run PowerShell as Administrator. When you
start the PowerShell console as Administrator, the top title bar reads: Administrator: Windows
PowerShell. If you see a different title, you know you have forgotten to run in Administrator
mode.
PowerShell Help
A good place to start when learning PowerShell is
the Get-Help cmdlet. This cmdlet displays a page
that explains how to get more information,
including downloading or updating help files.
Get-Help Cmdlet
When you understand how to use PowerShell help, learning PowerShell becomes a lot easier. You have already seen that the Get-Help cmdlet displays a summary of the help available in PowerShell. Within this summary, you will learn how to access help for specific cmdlets.
Get-Help
Get-Help Get-Item
The Get-Help cmdlet returns the name of the specified cmdlet, a synopsis of what it does, the syntax, description, related links, and remarks. There might be more than one syntax block, because a cmdlet can take different parameter sets. For example, Get-Item may have -Path, -Credential, or -Stream in position one.
Get-Alias
The Get-Alias cmdlet displays a list of all aliases, including gal, which is itself an alias for Get-Alias. Use Get-Alias to explore shortcuts to commonly used cmdlets; for example, Get-Alias -Definition Get-ChildItem lists the aliases dir, gci, and ls.
Wildcards
You can use wildcards to discover cmdlets related to a particular task. If you want to display all the
cmdlets that start with the Get verb, type Get-Help Get*. If you want to display all the cmdlets related to
SQL Server, type Get-Help *SQL* or Get-Help *SQLServer*. You can discover the cmdlets related to
Azure by typing Get-Help *Azure*.
Tab Completion
Using tab completion, you can type the first part of a cmdlet, and PowerShell will complete it for you. It
does not just complete the cmdlet, but also capitalizes it for you. Furthermore, you can type part of a
cmdlet, and then press TAB repeatedly to see all the cmdlets that begin in that way.
If you type Get-I and press TAB repeatedly, you will see all the cmdlets that start with Get-I, such as Get-Item, Get-ItemProperty, and Get-InitiatorId. Tab completion saves time, prevents mistakes, and makes it possible for you to see the range of cmdlets starting with a particular verb. If you mistakenly go past the cmdlet you want, use SHIFT+TAB to go back.
Explore Further
With help, you can work faster and do more with PowerShell. It is worth spending some time to
understand the PowerShell help system, and practice using it. Then when you need to do something, you
will know how to explore the cmdlets and find what you want.
Note: The cmdlet Get-Verb lists all PowerShell verbs. Type Get-Help Get-Verb for more
information.
PowerShell Modules
PowerShell commands are installed in modules or snap-ins. Each cmdlet is made up of a verb plus a prefixed noun. The prefix identifies the technology that the module covers; for example, SQL Server cmdlets (supplied by the sqlps module) use the Sql prefix, and Azure Resource Manager cmdlets use the AzureRm prefix. There are also modules for Active Directory, FTP, file security, and many more.
PowerShell loads the modules it requires into the active PowerShell console, either implicitly when you type a cmdlet, or explicitly when you type Import-Module sqlps.
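For example, a minimal sketch of loading the module explicitly; the -DisableNameChecking switch suppresses warnings about nonstandard verb names in older sqlps releases:

```powershell
# Load the SQL Server module explicitly.
Import-Module sqlps -DisableNameChecking;

# Confirm the module loaded and count the cmdlets it provides.
Get-Module sqlps;
Get-Command -Module sqlps | Measure-Object;
```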
Console or ISE
You can use PowerShell in one of two ways, either through the console or the PowerShell ISE (Integrated
Scripting Environment).
The console is started from the taskbar, and can be used to interact directly with the remote server by
typing cmdlets and viewing the information that is returned. Use the console when you want information
back from the server, and for testing scripts.
The ISE is designed to create, edit, debug, and run scripts. It can be started from the Windows
PowerShell group on the Start menu. You can also run cmdlets in the lower pane of the ISE.
PowerShell Scripts
PowerShell scripts are saved with a .ps1 extension. As a security measure, double-clicking a .ps1 file does not run it; you must open the script explicitly.
PowerShell Variables
Variables are identified with a $ symbol, and may be loosely or strongly typed. That is to say, you may
either declare a variable that will accept any type of data, such as text or numbers—or you can declare a
variable that will only take one data type. The advantage of strongly typed variables—which only accept
one data type—is that you can be more confident of results. Loosely typed variables may seem more
flexible, but are a common source of bugs in systems, and can be difficult to spot.
You declare a variable by typing $variablename, where variablename is the name of your variable. You then assign a value using the equal sign. If you want to declare a variable called DatabaseName and assign it a value of AdventureWorks2016, type the following:

$DatabaseName = "AdventureWorks2016"
There are different data types you can use, including the following common types:
[int] – integers
[string] – characters
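A minimal sketch of the two styles; the variable names are illustrative:

```powershell
# Loosely typed: the variable accepts any type of data.
$DatabaseName = "AdventureWorks2016";

# Strongly typed: only values of the declared type are accepted;
# assigning a non-numeric string to $RetryCount would raise an error.
[int]$RetryCount = 3;
[string]$ServerName = "MIA-SQL";
```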
SMO Namespaces
SMO is object-oriented and so organizes its classes
in namespaces. The classes used to manipulate
server and database settings are in the SMO
namespace. Other namespaces are SMO.Agent, which represents SQL Server Agents, and SMO.Mail,
which represents Database Mail. For a list of all SMO namespaces, together with links to their classes, see
the SQL Server Technical documentation:
Note: The SMO object model follows a similar hierarchical structure to that shown by
Object Explorer in SSMS.
PowerShell Providers
PowerShell providers offer a method of navigating
to, and accessing objects or data at, a certain
location. Consider File Explorer, a familiar utility
that provides a way of accessing data at certain
locations on a disk. PowerShell providers are more
flexible—you can use them to access not only
locations such as the file system, but also other
locations, such as the Windows registry. You can
use providers in PowerShell to access objects or
data.
Get-PSProvider
The Get-PSProvider cmdlet displays a list of
PowerShell providers and their associated drives.
To change to a location, use the Set-Location cmdlet (or its alias cd) and type the PSDrive name
followed by a colon:
Using Set-Location
Set-Location Env:
When you import a module, such as SQLPS, the SQL provider is also loaded and available.
Provider cmdlets
Provider cmdlets are used to access the data or objects. They include cmdlets such as Get-PSProvider, Get-PSDrive, which lists the available drives, and Get-Item, which retrieves data or objects from a location.
Additional Reading: To learn more about Windows PowerShell, see Learn Windows
PowerShell in a Month of Lunches (Don Jones and Jeffery D. Hicks). It is published by Manning
and is available as an e-book.
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. Run Setup.cmd in the D:\Demofiles\Mod11 folder as Administrator.
4. On the taskbar, right-click Windows PowerShell, and then click Run as Administrator.
6. In the PowerShell console, verify that the title bar reads Administrator: Windows PowerShell.
9. At the command prompt, type Get-Alias c*, and then press Enter.
10. At the command prompt, type Update-Help -Force -ErrorAction SilentlyContinue, and then press
Enter. Wait for the help files to install.
11. At the command prompt, type Get-Help, and then press Enter.
12. At the command prompt, type Get-Help Get-Help, and then press Enter.
13. At the command prompt, type Get-Help Get-Item, and then press Enter.
14. For each of the following cmdlets, at the command prompt, type the cmdlets, and then press Enter to
show the different sets of parameters:
Get-Help Get-Item -Examples
Get-Help Get-Item -Detailed
Get-Help Get-Item -ShowWindow
15. Close the Get-Item Help window.
16. At the command prompt, type Get-I, and then press TAB.
17. Press TAB repeatedly to show all the cmdlets that start with Get-I.
18. Press SHIFT+TAB to step backwards. Note how the capitalization is automatically corrected.
20. At the command prompt, type Get-PSProvider, and then press Enter.
21. At the command prompt, type Get-PSDrive, and then press Enter.
22. At the command prompt, type Import-Module SQLPS, and then press Enter.
23. Repeat steps 20 and 21 and note the additions to the list.
24. At the command prompt, type Set-Location SQLSERVER:\ or cd SQLSERVER:\, and then press Enter.
25. At the command prompt, type Get-ChildItem, and then press Enter.
26. At the command prompt, type Set-Location SQL, and then press Enter.
27. At the command prompt, type Get-ChildItem, and then press Enter.
28. At the command prompt, type Set-Location MIA-SQL, and then press Enter.
29. At the command prompt, type Get-ChildItem, and then press Enter.
30. At the command prompt, type Set-Location Default, and then press Enter.
31. At the command prompt, type Get-ChildItem, and then press Enter.
32. Review the list of objects and consider how they map to objects in SQL Server.
33. At the command prompt, type Set-Location Databases, and then press Enter.
34. At the command prompt, type Get-ChildItem, and then press Enter.
35. At the command prompt, type Exit, and then press Enter.
Lesson 2
Configure SQL Server Using PowerShell
You can use PowerShell in conjunction with SMO objects to view and amend settings, either from the
PowerShell console, or from within a PowerShell script; you can build scripts that automate tasks, and
apply them on many machines—or many times.
Lesson Objectives
After completing this lesson, you will be able to:
Declare Variables
The following example shows how you can use a PowerShell provider to access the SMO objects. Two variables are created: $databaseName and $database. The string value AdventureWorks2016 is assigned to $databaseName, which is then used to retrieve the SMO database object that is assigned to $database.
When you have assigned the SMO database object for AdventureWorks2016 to a variable, you can then
display properties such as Name and DataSpaceUsage. These are displayed in the console using the Write-
Host cmdlet.
Get-PSProvider;
Get-PSDrive;
Set-Location SQLSERVER:;
Set-Location \SQL\localhost\DEFAULT\Databases;
$databaseName = "AdventureWorks2016";
$database = Get-Item -Path $databaseName;
Write-Host ("Database name: " + $database.Name);
Write-Host ("Space used: " + $database.DataSpaceUsage);
Alter Method
To amend the settings, you should get the
property you are interested in, and assign it a new
value. To make the change permanent, use the
Alter method.
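A sketch of this get-assign-Alter pattern, assuming the default local instance and the AdventureWorks2016 database used elsewhere in this module:

```powershell
# Navigate to the databases of the default local instance.
Set-Location SQLSERVER:\SQL\localhost\DEFAULT\Databases;
$database = Get-Item -Path "AdventureWorks2016";

# Assign a new value to a property, then persist it with the Alter method.
$database.CompatibilityLevel = [Microsoft.SqlServer.Management.Smo.CompatibilityLevel]::Version130;
$database.Alter();
$database.CompatibilityLevel;
```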
Get-Member
But how did you know that CompatibilityLevel was a property of the database? And how did you know
that the database object had an Alter method?
Get-Member is the equivalent of Get-Help, but it applies to objects, rather than cmdlets. When you type
$database | Get-Member, PowerShell displays a list of the object’s properties, methods, and events. With
Get-Member you can find out how to work with any object.
Note: Get-Member uses the pipe symbol (|), which appears on most keyboards as a vertical bar.
Note: If you try to use Get-Member with something that is not an object, you will receive
an error message.
Settings Objects
In the same way that you can access the properties, methods, and events of other objects, you can use the Settings object to view and amend server-level settings. The Get-Member cmdlet displays all available properties of the settings object.
$server.Settings
$server.Settings | Get-Member -MemberType Property;
LoginMode
Login mode is one of the properties exposed by the settings object, and is normally set on the Security
page of the Properties dialog box in SSMS. It can either be Windows Authentication, or SQL Server and
Windows Authentication.
You can use the Settings object to amend the LoginMode property:
$server.Settings.LoginMode
$server.Settings.LoginMode = [Microsoft.SqlServer.Management.Smo.ServerLoginMode]::Mixed;
$server.Settings.Alter();
$server.Settings.LoginMode;
PowerShell ISE
The PowerShell ISE is designed to develop
PowerShell scripts. It has two sections: a scripting
pane where you write the script that will be saved,
and the console where you can write cmdlets and
display results. The console window behaves just
like the stand-alone console; you can use it to test
PowerShell scripts, get help about cmdlets, and
make sure everything works as expected.
Tools, Options
You can customize ISE settings such as font size, window color, and IntelliSense. Click Tools, and then select Options to view and amend settings. It is worth choosing a font and font size that suit you, to make the tool easier to work with.
Toolbar
The toolbar contains many familiar icons, including new, open, and save. It also includes two icons used to run script: the Run icon, which is a large solid green triangle, and the Run Selection icon, which is a smaller green triangle in front of a file symbol. The Run icon runs the entire script; the Run Selection icon runs only the code you have highlighted.
Sequencing Activity
Put the following steps in order by numbering each to indicate the correct order.
Steps
Amend a property.
Lesson 3
Administer and Maintain SQL Server with PowerShell
With PowerShell, you can manage a number of different Windows servers running SQL Server, using a
common scripting language. This means you can automate tasks that cross operating system and SQL
Server boundaries. In this lesson, you will explore a number of different tasks that introduce you to using
PowerShell to administer and maintain SQL Server.
Lesson Objectives
After completing this lesson, you will be able to:
Smo.DatabaseRole
$rolename = "MyRole";
$role = New-Object Microsoft.SqlServer.Management.Smo.DatabaseRole($database, $rolename);
$role.Create();
# List the roles and make sure the new role has been created
$database.Roles;
Smo.ObjectPermissionSet
# The permission is passed in to the constructor of the ObjectPermissionSet object
$permset1 = New-Object Microsoft.SqlServer.Management.Smo.ObjectPermissionSet([Microsoft.SqlServer.Management.Smo.ObjectPermission]::Select);
# Grant this permission set to the role you just created, so that
# it can perform SELECT on objects in the Sales schema
$salesschema = $database.Schemas["Sales"];
$salesschema.Grant($permset1, $rolename);
Add a User
To add a user, you use the Smo object User, and then call the Create method.
Add a user:
Smo.User
$loginname = "AW\Gragg";
$username = "Marshall";
$user = New-Object Microsoft.SqlServer.Management.Smo.User($database, $username);
$user.Login = $loginname;
$user.Create();
# List the database users and make sure the new user has been created
$database.Users;
This topic has introduced the SMO hierarchy; you can access objects and their properties to create and amend users and database roles.
Additional Reading: For more information about automating SQL tasks, see SQL Server
2014 with PowerShell v5 Cookbook (Donabel Santos, Packt Publishing).
Beyond Transact-SQL
You can use PowerShell in all sorts of different
ways. For example, Transact-SQL works well for
tasks within a database, but does not work across
SQL Server instances. Yet there are times when
you need an overview of SQL Server instances,
perhaps to see the disk space being used, or logins that are members of the sysadmin server role.
PowerShell Pipes
The PowerShell pipe is the | symbol. It is used to send objects from one cmdlet to another, as if they were being sent down a pipeline. To display information about several SQL Server instances, you can combine the Get-ChildItem cmdlet with other cmdlets through pipes.
Get-ChildItem
Get-ChildItem is used with the SQL Server provider to access SMO and retrieve information about SQL Server instances. By filtering and formatting the data, you can retrieve a useful snapshot across instances and machines.
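For example, a sketch that lists the databases of the default local instance in descending order of size (the SMO Size property is reported in MB):

```powershell
# Navigate to the databases of the default local instance, then
# sort, select, and format the results through the pipeline.
Set-Location SQLSERVER:\SQL\localhost\DEFAULT\Databases;
Get-ChildItem |
    Sort-Object -Property Size -Descending |
    Select-Object -Property Name, Size |
    Format-Table -AutoSize;
```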
Backup-SqlDatabase
Backup-SqlDatabase is the cmdlet used to
back up databases. It is a flexible cmdlet with
seven parameter sets, meaning you can choose
from a number of different ways of specifying the
backup. These include the database to be backed
up, the backup location, log truncation,
encryption, and much more.
Backup-SqlDatabase
# Get the server object
# This is an SMO object and you can access its properties and methods
$server = Get-Item -Path DEFAULT;
$serverName = $server.Name;
As the database is being backed up, PowerShell shows a progress bar to indicate the backup is running.
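The snippet above only retrieves the server object; a hedged sketch of the backup call itself, with an illustrative database name and file path, might look like this:

```powershell
# Back up a database to a full backup file; the path is an example.
Backup-SqlDatabase -ServerInstance $serverName `
    -Database "AdventureWorks2016" `
    -BackupFile "D:\Backups\AdventureWorks2016.bak";
```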
Restore-SqlDatabase
Unsurprisingly, there is an equivalent cmdlet to restore databases, namely Restore-SqlDatabase.
Windows Updates
Windows Updates are crucial in keeping systems secure and up to date, but can also be a source of errors.
Creating a PowerShell script that produces a list of Windows Updates, formatted and filtered, is a quick
way of obtaining information about possible problem areas.
Get-HotFix
Get-HotFix is a PowerShell cmdlet that displays information about hotfixes that have been applied to local
or remote machines. It takes parameters with which you can report on just one machine.
Hotfixes, as their name suggests, provide fixes to an urgent problem—this is a subset of Windows
Updates. As such, the Get-HotFix cmdlet does not report all Windows Updates. Find out more about Get-
HotFix by typing Get-Help Get-HotFix in the PowerShell console.
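As a minimal sketch (using the property names that Get-HotFix exposes), the following pipeline lists the most recently installed hotfixes on the local machine:

```powershell
# Sketch: show the ten most recent hotfixes on the local machine
Get-HotFix |
    Sort-Object -Property InstalledOn -Descending |
    Select-Object -First 10 -Property HotFixID, Description, InstalledOn
```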
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. On the taskbar, right-click Windows PowerShell, and then click Run ISE as Administrator.
7. Select the code under the #2# comment, and then on the toolbar, click Run Selection to set the
location.
8. Select the code under the #3# comment, and then on the toolbar, click Run Selection to display the
instances of SQL Server.
9. Select the code under the #4# comment, and then on the toolbar, click Run Selection to display a
formatted list of SQL Server instances.
10. Select the code under the #5# comment, and then on the toolbar, click Run Selection to display a list
of databases in descending order of size.
11. Select the code under the #6# comment, and then on the toolbar, click Run Selection to display a
tabular list of databases in descending order of size.
12. Select the code under the #7# comment, and then on the toolbar, click Run Selection to output the
information to a text file.
13. Select the code under the #8# comment, and then on the toolbar, click Run Selection to output the
information to an XML file.
14. Select the code under the #9# comment, and then on the toolbar, click Run Selection to output the
information to an Excel file.
Lesson 4
Managing Azure SQL Databases Using PowerShell
You can use Azure PowerShell to automate a range of different tasks in Azure. Although the Azure Portal
is friendly and straightforward to use, it can be time consuming if you have repetitive tasks. With Azure
PowerShell, you can set up virtual machines, and automate aspects of your Azure work.
This lesson looks at how to use Azure PowerShell to connect to an existing subscription, create a new
server, and then create an Azure SQL Database.
Lesson Objectives
After completing this lesson, you will be able to:
Azure PowerShell
https://aka.ms/Kunm8d
Add your Azure account so it can be used with Azure Resource Manager.
Specify a subscription:
Select-AzureRmSubscription
Select-AzureRmSubscription -SubscriptionID 927sh-ld924yh-sldhfhsk1
You are now ready to work with Azure PowerShell, using your Azure account, and your subscription.
Create a Server
Before you can create an Azure SQL Database, you
must create a server for the database. This is done
using the New-AzureRmSqlServer cmdlet. When
you create a new server on Azure, you must
specify the data center and the resource group
you want to use.
The resource group is a way of keeping resources
together. You must create the resource group at
the same location that you want to host your
Azure SQL Database.
List the Azure Data Centers that support Azure SQL Database.
Get-AzureRmResourceProvider
(Get-AzureRmResourceProvider -ListAvailable | Where-Object {$_.ProviderNamespace -eq
'Microsoft.Sql'}).Locations
New-AzureRmResourceGroup
New-AzureRmResourceGroup -Name "PowerShellTest" -Location "Japan East"
New-AzureRmSqlServer
New-AzureRmSqlServer -ResourceGroupName "PowerShellTest" -ServerName "pstest201604ce" -
Location "Japan East" -ServerVersion "12.0"
You will be prompted to add credentials with a user name and password.
Note: Although you have to create a server for your Azure SQL Database, you do not have
access to the server. You cannot log on to the server, or amend any settings. The server is
required to host Azure SQL Database.
New-AzureRmSqlServerFirewallRule
New-AzureRmSqlServerFirewallRule -ResourceGroupName "PowerShellTest" -ServerName
"pstest201604ce" -FirewallRuleName "myFirewallRule" -StartIpAddress "nnn.nnn.nnn.nnn" -
EndIpAddress "nnn.nnn.nnn.nnn"
When the firewall rule has been successfully created, confirmation details are returned to the PowerShell
console.
New-AzureRmSqlDatabase
New-AzureRmSqlDatabase -ResourceGroupName "PowerShellTest" -ServerName "pstest201604ce" -
DatabaseName "testpsdb" -Edition Standard -RequestedServiceObjectiveName "S1"
Get-
New-
Remove-
Set-
Start-
Stop-
Suspend-
Use-
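For example, the Get- and Remove- verbs retrieve and delete Azure SQL Database resources. A sketch, reusing the example resource group, server, and database names from earlier in this lesson:

```powershell
# Sketch: list the databases on the server created earlier (names are examples)
Get-AzureRmSqlDatabase -ResourceGroupName "PowerShellTest" -ServerName "pstest201604ce"

# Remove the test database when it is no longer needed
Remove-AzureRmSqlDatabase -ResourceGroupName "PowerShellTest" -ServerName "pstest201604ce" -DatabaseName "testpsdb"
```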
You have already used some of these cmdlets in learning how to provision a new Azure SQL Database. For
a list of Azure SQL Database cmdlets, see the MSDN:
Demonstration Steps
Install the AzureRM PowerShell Module
2. On the taskbar, right-click Windows PowerShell, and then click Run as Administrator.
4. At the command prompt, type Install-Module AzureRM, and then press Enter.
5. At the NuGet provider is required to continue message, type Y, and then press Enter.
7. Wait until the installation completes, and then close the PowerShell window.
1. At the command prompt, link your Azure account to PowerShell by typing the following cmdlet,
and then press Enter:
Add-AzureRmAccount
3. At the Azure sign-on screen, type the user name and password you use to sign in to the Azure portal,
and then click Sign in.
4. At the command prompt, to link your Azure account to PowerShell on this VM, type the following
cmdlet:
Login-AzureRmAccount
5. Wait for the Azure sign-in screen to appear, type the user name and password you use to sign in to
the Azure portal, and then click Sign in.
6. At the command prompt, type the following cmdlet, and then press Enter, substituting the
SubscriptionID that was returned in the previous step for <yoursubscriptionid>:
7. At the command prompt, type the following cmdlet to return a list of Azure data center locations,
and then press Enter:
8. At the command prompt, type the following cmdlet to create a resource group, and then press Enter.
Substitute a location from the list returned in the previous step for <location>:
9. At the command prompt, type the following cmdlet to create a new server in the resource group you
just created, and then press Enter. Substitute the location used in the previous step for <location>.
Substitute a unique server name for <your server name>. The name must be unique across the whole
Azure service, so a specific name cannot be given here. A suggested format is sql2016ps-<your
initials><one or more digits>. For example, sql2016ps-js123. Letters must be lowercase.
10. In the Windows PowerShell credential request dialog box, in the User name box, type psUser, in
the Password box, type Pa55w.rd, and then click OK. Wait for Azure to create the administrator for
your server and for the console to display the information.
11. At the command prompt, type the following cmdlet to create a variable to store your external IP
address and then press Enter. Substitute your own relevant information for the <your external ip>
parameter:
Note: You can get your current external IP address from the Azure Portal (see the value
returned by the "Add Client IP" button on the firewall for an existing server), or from third-party
services such as www.whatismyip.com.
12. At the command prompt, type the following cmdlet to create a firewall rule that permits you to
connect to the server, and then press Enter. Substitute your own relevant information for the <your
server name> parameter:
13. At the command prompt, type the following cmdlet to create an Azure SQL Database on the server
you have just created, and then press Enter. Substitute the name of your server for <your server
name>:
This will take a few minutes to complete. Wait for the details of the new database to be returned—
this indicates that the database has been created.
Objectives
After completing this lab, you will be able to:
Use PowerShell to change database settings and SQL Server instance settings.
Estimated Time: 30 minutes
Password: Pa55w.rd
7. List the available PSProviders again, noting the new entry in the list.
8. List the available PSDrives again, noting the new entry in the list.
Results: After completing this exercise, you will have investigated PowerShell help and the SQL
PowerShell provider.
12. Verify the correctness of the statement by placing a mark in the column to the right.

Statement: True or false? PowerShell providers offer an alternative to using SQL Server Management
Objects.

Answer:
2. Open D:\Labfiles\Lab11\Starter\DisplayProperties.ps1.
3. Select and run the code to import the SQL PowerShell module and test that it has been loaded by
typing Get-PSProvider in the console window.
4. Select and run the code to set the location. Check that the directory has been changed correctly.
5. Select and run the code to assign the database object to the $database variable. Type Write-Host
$database.Name to show the name of the object. To check it is an object, type $database | Get-
Member.
6. Select and run the code to display some of the database properties.
7. Select and run the code to display some of the database options.
2. Select and run the code to import the SQL PowerShell module.
5. Select and run the code to change the compatibility level of the AdventureWorks2016 database.
6. Select and run the code to change some of the database options.
2. Select and run the code to import the SQL PowerShell module.
4. Select and run the code to assign the server object to the $server variable.
5. Select and run the code to view properties of the server object, and then to display the properties as a
formatted list.
6. Select and run the code to change the login mode for the server.
7. Select and run the code to change the login mode back to integrated.
10. Select and run the code to reset some server settings back to their original values.
Results: After completing this lab exercise, you will have PowerShell scripts to show the IT Director.
Question: Can you name three ways of getting information about a cmdlet?
PowerShell scripts can be run repeatedly, with predictable results, making SQL Server tasks faster and
reducing the number of problems.
Best Practice: Use aliases when working with the console window, but make scripts easy to
read by using correctly capitalized cmdlet names.
It takes a little time to understand how PowerShell works, but the investment will pay off when tasks have
to be run repeatedly, or when you have to be certain that tasks have been fully tested.
The efficiency improvements that can be gained by automating tasks are immense, and will be well worth
the investment in learning to use PowerShell effectively.
Review Question(s)
Question: What tasks might benefit from automating with PowerShell for your SQL Server
environment?
Module 12
Tracing Access to SQL Server with Extended Events
Contents:
Module Overview 12-1
Lesson 1: Extended Events Core Concepts 12-2
Module Overview
Monitoring performance metrics provides a great way to assess the overall performance of a database
solution. However, there are occasions when you need to perform more detailed analysis of the activity
occurring within a Microsoft® SQL Server® instance—to troubleshoot problems and identify ways to
optimize workload performance.
SQL Server Extended Events is a flexible, lightweight event-handling system built into the Microsoft SQL
Server Database Engine. This module focuses on the architectural concepts, troubleshooting strategies and
usage scenarios of Extended Events.
Note: Extended Events has been available in Microsoft Azure® SQL Database as a preview
feature since October, 2015; at the time of publication, no date has been published for the
General Availability (GA) of Extended Events in Azure SQL Database.
For more information about Extended Events in Azure SQL Database, see:
Objectives
After completing this module, you will be able to:
Lesson 1
Extended Events Core Concepts
This lesson focuses on the core concepts of Extended Events, covering the architectural design of the
Extended Events engine in depth.
Lesson Objectives
After completing this lesson, you will be able to:
Explain the differences between SQL Server Profiler, SQL Trace, and Extended Events.
SQL Trace
SQL Trace is a server-side, event-driven activity
monitoring tool; it can capture information about
more than 150 event classes. Each event returns
data in one or more columns and you can filter
column values. You configure the range of events
and event data columns in the trace definition;
you can configure the destination for the trace
data, a file or a database table, in the trace
definition.
SQL Trace is included in SQL Server 7.0 and later versions.
As established parts of the SQL Server platform, SQL Server Profiler and SQL Trace are familiar to many
SQL Server administrators.
Note: SQL Trace and SQL Server Profiler have been marked for deprecation. SQL Server
includes SQL Server Profiler and SQL Trace, but Microsoft intends to remove both tools in a
future version of SQL Server. Extended Events is now the activity tracing tool recommended by
Microsoft.
Extended Events
Extended Events was introduced in SQL Server 2008. Like SQL Trace, Extended Events is an event-driven
activity monitoring tool; however, it attempts to address some of the limitations in the design of SQL
Trace by following a loosely coupled design pattern. Events and their targets are not tightly coupled; any
event can be bound to any target. This means that data processing and filtering can be carried out
independently of data capture, which, in most cases, results in Extended Events having a lower
performance overhead than an equivalent SQL Trace.
With Extended Events, you can define sophisticated filters on captured data. In addition to using value
filters, you can filter events by sampling and data can be aggregated at the point it is captured. You can
manage Extended Events either through a GUI in SQL Server Management Studio (SSMS) or by using
Transact-SQL statements.
You can integrate Extended Events with the Event Tracing for Windows (ETW) framework, so that you can
monitor SQL Server activity alongside other Windows® components.
The additional flexibility of Extended Events comes at the cost of greater complexity compared to the
previous tools.
Packages
Packages act as containers for the Extended Events
objects and their definitions; a package can
expose any of the following object types:
Events
Predicates
Actions
Targets
Types
Maps
Packages are contained in a module that exposes them to the Extended Events engine. A module can
contain one or more packages, and can be compiled as an executable or DLL file.
A complete list of packages registered on the server can be viewed using the sys.dm_xe_packages DMV:
sys.dm_xe_packages
SELECT * FROM sys.dm_xe_packages;
For more information on sys.dm_xe_packages, see the topic sys.dm_xe_packages (Transact-SQL) in the
SQL Server Technical Documentation:
sys.dm_xe_packages (Transact-SQL)
https://aka.ms/Lonpcq
Events
Events are points in the code of a module that are
of interest for logging purposes. When an event
fires, it indicates that the corresponding point in
the code was reached. Each event type returns
information in a well-defined schema when it
occurs.
All available events can be viewed in the
sys.dm_xe_objects DMV under the event
object_type:
sys.dm_xe_objects; events
SELECT * FROM sys.dm_xe_objects
WHERE object_type = 'event';
Events are defined by the Event Tracing for Windows (ETW) model—this means that SQL Server Extended Events
can be integrated with ETW. Like ETW events, Extended Events are categorized by:
Channel. The event channel identifies the target audience for an event. These channels are common
to all ETW events:
Keyword. An application-specific categorization. In SQL Server, Extended Events event keywords map
closely to the grouping of events in a SQL Trace definition.
When you add, amend or remove an event from a package, you must refer to it with a two-part name:
package name.event name.
A complete list of events and their package names can be returned by joining the list of events returned
by the first example in this lesson to sys.dm_xe_packages:
To find all the attribute columns associated with an event, you should join sys.dm_xe_objects to
sys.dm_xe_object_columns:
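The two queries described above can be sketched as follows (the column names come from the DMV definitions; the event name in the second query is an example):

```sql
-- Events with their package names
SELECT p.name AS package_name, o.name AS event_name
FROM sys.dm_xe_objects AS o
JOIN sys.dm_xe_packages AS p
    ON o.package_guid = p.guid
WHERE o.object_type = 'event';

-- Attribute columns associated with one event (example: sql_statement_completed)
SELECT o.name AS event_name, c.name AS column_name, c.type_name
FROM sys.dm_xe_objects AS o
JOIN sys.dm_xe_object_columns AS c
    ON c.object_name = o.name
   AND c.object_package_guid = o.package_guid
WHERE o.object_type = 'event'
  AND o.name = 'sql_statement_completed';
```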
For more information on sys.dm_xe_objects, see the topic sys.dm_xe_objects (Transact-SQL) in the SQL
Server Technical Documentation:
sys.dm_xe_objects (Transact-SQL)
https://aka.ms/Uhmmhw
Predicates
Predicates are logical rules with which events can
be selectively captured, based on criteria you
specify. Predicates divide into two subcategories:
In addition to building logical rules, predicates can store data in a local context; this makes it possible to
construct counter-based predicates that fire, for example, every n events or every n seconds.
Predicates are applied to an event using a WHERE clause that functions like the WHERE clause in a
Transact-SQL query.
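As a sketch of the WHERE clause syntax (the duration field of sql_statement_completed is reported in microseconds; session and target names are examples):

```sql
-- Sketch: capture only statements that ran for longer than one second
CREATE EVENT SESSION LongStatements ON SERVER
ADD EVENT sqlserver.sql_statement_completed (
    WHERE (duration > 1000000)
)
ADD TARGET package0.ring_buffer;
```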
All available predicates can be viewed in the DMV sys.dm_xe_objects under the object_type values
pred_source and pred_compare:
sys.dm_xe_objects; predicates
SELECT * FROM sys.dm_xe_objects
WHERE object_type LIKE 'pred%'
ORDER BY object_type, name;
Actions
Actions are responses to an event; you can use
these responses to collect supplementary
information about the context of an event at the
time an event occurs. Each event may have a
unique set of one or more actions associated with
it. When an event occurs, any associated actions
are raised synchronously.
Collect database ID
Collect session ID
All available actions can be viewed in the DMV sys.dm_xe_objects, under the object_type value action:
sys.dm_xe_objects; actions
SELECT * FROM sys.dm_xe_objects
WHERE object_type = 'action';
Targets
Targets are the Extended Events objects that
collect data. When an event is triggered, the
associated data can be written to one or more
targets. A target may be updated synchronously or
asynchronously. The following targets are available
for Extended Events:
Event counter. The counter is incremented
each time an event associated with a session
occurs. Synchronous.
Event file. Event data is written to a file on
disk. Asynchronous.
Event pairing. Tracks when events that normally occur in pairs (for example, lock acquired and lock
released) do not have a matching pair. Asynchronous.
Event Tracing for Windows. Event data is written to an ETW log. Synchronous.
Histogram. A more complex counter that partitions counts by an event or action value.
Asynchronous.
Ring buffer. A first-in, first-out (FIFO) in-memory buffer of a fixed size. Asynchronous.
The design of Extended Events is such that an event will only be written once to a target, even if multiple
sessions are configured to send that event to the same target.
All available targets can be viewed in the DMV sys.dm_xe_objects under the object_type value target:
sys.dm_xe_objects; targets
SELECT * FROM sys.dm_xe_objects
WHERE object_type = 'target';
Sessions
A session links one or more events to one or more
targets. You can configure each event in a session
to include one or more actions, and to be filtered
with one or more predicates. Once defined, a
session can be started or stopped as required; you
can configure a session to start when the database
engine starts.
A session is configured with a buffer in which event data is held while a session is running, before it is
dispatched to the session targets. The size of this buffer is configurable, as is a dispatch policy (how long
data will be held in the buffer). You can also configure whether or not to permit data loss from the buffer
if event data arrives faster than it can be processed and dispatched to the session target.
All active Extended Events sessions can be viewed in the DMV sys.dm_xe_sessions:
sys.dm_xe_sessions
SELECT * FROM sys.dm_xe_sessions;
For more information on the set of DMVs for accessing information about active Extended Events
sessions, including sys.dm_xe_sessions, see the topic Extended Events Dynamic Management Views in the
SQL Server Technical Documentation:
Extended Events Dynamic Management Views
https://aka.ms/Jiinp1
All Extended Events sessions defined on a server can be returned by querying the DMV
sys.server_event_sessions:
sys.server_event_sessions
SELECT * FROM sys.server_event_sessions;
For more information on the set of DMVs for accessing definitions for all Extended Events sessions,
including sys.server_event_sessions, see the topic Extended Events Catalog Views (Transact-SQL) in the
SQL Server Technical Documentation:
Note: A session can be created without targets, in which case the session data will only be
visible through the Watch Live Data feature of SSMS.
Types
Internally, Extended Events data is held in binary.
A type identifies how a binary value should be
interpreted and presented when the data is
queried.
All available types can be viewed in the DMV
sys.dm_xe_objects under the object_type value
type:
sys.dm_xe_objects; types
SELECT * FROM sys.dm_xe_objects
WHERE object_type = 'type';
Maps
A map is a lookup table for integer values. Internally, many event and action data values are stored as
integers; maps link these integer values to text values that are easier to interpret.
sys.dm_xe_map_values
SELECT * FROM sys.dm_xe_map_values
ORDER BY name, map_key;
Demonstration Steps
1. Ensure the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines are
running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa55w.rd.
3. In the User Account Control dialog box, click Yes and wait for the script to finish.
4. Start SQL Server Management Studio and connect to the MIA-SQL database engine instance using
Windows authentication.
6. In the Open Project dialog box, navigate to the D:\Demofiles\Mod12 folder, click Demo.ssmssln,
and then click Open.
7. In Solution Explorer, expand Queries, and then double-click Demo 1 - create xe session.sql.
8. Select code under the comment that begins Step 1, and then click Execute to create an Extended
Events session.
9. Select code under the comment that begins Step 2, and then click Execute to verify that the session
metadata is visible.
10. Select code under the comment that begins Step 3, and then click Execute to start the session and
execute some queries.
11. Select code under the comment that begins Step 4, and then click Execute to query the session data.
12. Select code under the comment that begins Step 5, and then click Execute to refine the session data
query.
13. In Object Explorer, under MIA-SQL, expand Management, expand Extended Events, expand
Sessions, expand SqlStatementCompleted, and then double-click package0.ring_buffer.
14. In the Data column, click the XML value, and note that this is the same data that is returned by the
query under the comment that begins Step 4 (note that additional statements will have been
captured because you ran the code earlier).
15. In Object Explorer, right-click SqlStatementCompleted, and then click Watch Live Data.
16. In the Demo 1 - create xe sessions.sql query pane, select the code under the comment that begins
Step 7, and then click Execute to execute some SQL statements.
17. Return to the MIA-SQL - SqlStatementCompleted: Live Data pane. Wait for the events to be
captured and displayed; this can take a few seconds. Other SQL statements from background
processes might be captured by the session.
18. If the results do not appear, repeat steps 16 and 17.
19. In the Demo 1 - create xe sessions.sql query pane, select the code under the comment that begins
Step 8, and then click Execute to stop the session.
20. In Object Explorer, right-click SqlStatementCompleted, and then click Properties.
21. In the Session Properties dialog box, review the settings on the General, Events, Data Storage and
Advanced pages, if necessary referring back to the session definition under the comment that begins
Step 1.
23. Select the code under the comment that begins Step 10, and then click Execute to drop the session.
24. Keep SQL Server Management Studio open for the next demonstration.
sys.dm_xe_session_targets
sys.dm_xe_session_events
sys.dm_xe_sessions
sys.dm_xe_session_event_actions
Lesson 2
Working with Extended Events
This lesson discusses using Extended Events. It covers common scenarios in which you might create
Extended Events sessions for troubleshooting and performance optimization, as well as the system_health
Extended Events session, which captures several events relevant to performance tuning.
Lesson Objectives
At the end of this lesson, you will be able to:
Configuring Sessions
As you have learned, Extended Events sessions are
composed from several other object types—
primarily, events and targets. Sessions also have a
number of configuration options that are set at
session level:
EVENT_RETENTION_MODE. Controls whether events can be discarded when the session buffers are full:
o ALLOW_SINGLE_EVENT_LOSS. An event can be dropped from the session if the buffers are full. A
compromise between performance and data loss, this is the default value.
o NO_EVENT_LOSS. Events are never discarded; tasks that trigger events must wait until event
buffer space is available. Potential for severe performance impact, but no data loss.
MAX_DISPATCH_LATENCY. The amount of time events will be held in event buffers before being
dispatched to targets. Defaults to 30 seconds. You may set this value to INFINITE, in which case the
buffer is only dispatched when it is full, or the session is stopped.
MAX_EVENT_SIZE. For single events larger than the size of the buffers specified by MAX_MEMORY,
use this setting. If a value is specified (in kilobytes or megabytes), it must be greater than
MAX_MEMORY.
MEMORY_PARTITION_MODE:
STARTUP_STATE. When set to ON, the session will start when SQL Server starts. The default value is
OFF.
TRACK_CAUSALITY. When set to ON, an identifier is added to each event identifying the task that
triggered the event. With this, you can determine whether one event is caused by another.
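A minimal sketch showing several of these options in a session definition (session and file names are examples):

```sql
CREATE EVENT SESSION OptionDemo ON SERVER
ADD EVENT sqlserver.sql_statement_completed
ADD TARGET package0.event_file (SET filename = 'OptionDemo.xel')
WITH (
    EVENT_RETENTION_MODE = ALLOW_SINGLE_EVENT_LOSS,
    MAX_DISPATCH_LATENCY = 30 SECONDS,
    STARTUP_STATE = OFF,
    TRACK_CAUSALITY = ON
);
```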
For more information about configuring a session through Transact-SQL, see the topic CREATE EVENT
SESSION (Transact-SQL) in the SQL Server Technical Documentation:
Configuring Targets
Several Extended Events targets take configuration
values when they are added to a session.
Event File
The Event File target can be used to write session
data to a file. It takes the following configuration
parameters:
filename. The file name to write to; this can
be any valid file name. If a full path is not
specified, the file will be created in the
\MSSQL\Log folder of the SQL Server instance
on which the session is created.
max_file_size. The largest size that the file may grow to; the default value is 1 GB.
max_rollover_files. The number of files that have reached max_file_size to retain. The oldest file is
deleted when this number of files is reached.
increment. The file growth increment, in megabytes. The default value is twice the size of the session
buffer.
For more information on configuring the event file target, see the topic Event File Target in the SQL Server
Technical Documentation:
Event Pairing
The Event Pairing target is used to match events that occur in pairs (for example, statement starting and
statement completing, or lock acquired and lock released), and report on beginning events that have no
matching end event. It takes the following configuration parameters:
For more information on configuring the event pairing target, see the topic Event Pairing Target in the
SQL Server Technical Documentation:
Ring Buffer
The Ring Buffer target is used to write session data into a block of memory. When the allocated memory
is full, the oldest data in the buffer is discarded and new data is written in its place. The ring buffer target
takes the following configuration parameters:
max_memory. The maximum amount of memory the ring buffer might use, in kilobytes.
max_event_limit. The maximum number of events the ring buffer might hold. The default value is
1,000.
occurrence_number. The number of events of each type to keep in the ring buffer. When events are
discarded from the buffer, this number of each event type will be preserved. The default value, zero,
means that events are discarded on a pure first-in, first-out basis.
Events are dropped from the buffer when either the max_memory or max_event_limit value is reached.
For more information on configuring the ring buffer target, see the topic Ring Buffer Target in the SQL
Server Technical Documentation:
Histogram
The Histogram target is used to partition a count of events into groups based on a specified value. It takes
the following configuration parameters:
slots. The maximum number of groups to retain. When this number of groups is reached, new values
are ignored. Optional.
filtering_event_name. The event that will be counted into groups. Optional. If not supplied, all
events are counted.
source_type. The type of object used for grouping. 0 for an event; 1 for an action.
source. The event column or action that is used to create group names and partition the count of
events.
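A sketch of the histogram target grouping a count of wait_info events by wait type (parameter names are those listed above):

```sql
CREATE EVENT SESSION WaitHistogram ON SERVER
ADD EVENT sqlos.wait_info
ADD TARGET package0.histogram (
    SET filtering_event_name = 'sqlos.wait_info',
        source_type = 0,   -- 0 = group on an event column
        source = 'wait_type'
);
```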
For more information on configuring the histogram target, see the topic Histogram Target in the SQL
Server Technical Documentation:
Histogram Target
http://aka.ms/j9qkw9
Event Tracing for Windows
The Event Tracing for Windows (ETW) target publishes event data to ETW. It takes the following
configuration parameters:
default_xe_session_name. The name for the ETW session. There can only be one ETW session on a
machine, and it will be shared between all instances of SQL Server. The default value is
XE_DEFAULT_ETW_SESSION.
default_etw_session_logfile_path. The path for the ETW log file. The default value is
%TEMP%\XEEtw.etl.
default_etw_session_logfile_size_mb. The log file size, in megabytes. The default value is 20 MB.
default_etw_session_buffer_size_kb. The event buffer size. The default value is 128 KB.
retries. The number of times to retry publishing to ETW before discarding the event. The default is 0.
For more information on configuring the Event Tracing for Windows target, see the topic Event Tracing for
Windows Target in the SQL Server Technical Documentation:
Event Counter
The Event Counter target is used to count events in a session. It takes no configuration parameters.
The system_health Session
SQL Server includes a built-in Extended Events session named system_health. The information it captures
includes:
The sql_text and session_id for sessions that encounter a memory-related error.
The callstack, sql_text, and session_id for sessions that have waited for more than 15 seconds on
selected resources (including latches).
The callstack, sql_text, and session_id for any sessions that have waited for 30 seconds or more for
locks.
The callstack, sql_text, and session_id for any sessions that have waited for a long time for
preemptive waits. (A preemptive wait occurs when SQL Server is waiting for external API calls to
complete. The trigger time varies by wait type).
The callstack and session_id for CLR allocation and virtual allocation failures (when insufficient
memory is available).
The ring_buffer events for the memory broker, scheduler monitor, memory node OOM, security, and
connectivity. This tracks when an event is added to any of these ring buffers.
A ring buffer target, configured to hold up to 5,000 events and to occupy no more than 4 MB.
Note: The details of the system_health session are best understood by looking at its
definition. You can generate a definition from SSMS:
1. Connect SSMS Object Explorer to any SQL Server instance on which you have administrative rights.
2. In the Object Explorer pane, expand Management, expand Extended Events, and then expand
Sessions.
3. Right-click the system_health node, click Script Session As, click CREATE To, and then click New
Query Editor Window. A script to recreate the system_health session will be generated.
Because both targets are configured to roll over and discard the oldest data they contain when they are
full, the system_health session will only contain the most recent issues. On instances of SQL Server where
the system_health session is capturing a lot of events, the targets may roll over before you can examine
specific events.
Execution Time-outs
When a Transact-SQL statement runs for longer
than the client application’s command time-out
setting, a time-out error will be raised by the client
application. Without detailed client application
logging, it might be difficult to identify the
statement causing a time-out.
When a time-out occurs, the starting event will have no corresponding completed event, and will be
returned in the output of the event pairing target.
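The event pairing approach described above can be sketched as follows. This is a sketch under assumptions: the session name is invented, and matching on session_id alone is a simplification (production sessions often match on additional columns to handle concurrent statements correctly).

```sql
-- Sketch: unmatched sql_statement_starting events indicate statements that never
-- completed, such as those cancelled by a client command time-out.
CREATE EVENT SESSION [FindTimeouts] ON SERVER
ADD EVENT sqlserver.sql_statement_starting
    (ACTION (sqlserver.session_id, sqlserver.sql_text)),
ADD EVENT sqlserver.sql_statement_completed
    (ACTION (sqlserver.session_id, sqlserver.sql_text))
ADD TARGET package0.pair_matching
(SET begin_event = N'sqlserver.sql_statement_starting',
     begin_matching_actions = N'sqlserver.session_id',
     end_event = N'sqlserver.sql_statement_completed',
     end_matching_actions = N'sqlserver.session_id',
     respond_to_memory_pressure = 0);
```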
Troubleshooting ASYNC_NETWORK_IO
The ASYNC_NETWORK_IO wait type occurs when the database engine is waiting for a client application to
consume a result set. This can occur because the client application processes a result set row-by-row as it
is returned from the database server.
To troubleshoot this issue with Extended Events, capture the sqlos.wait_info event, filtering on wait_type
value NETWORK_IO. The histogram target might be suitable for this investigation, using either the client
application name or the client host name to define histogram groups.
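A session for this investigation might look like the following sketch. The session name is illustrative, and the wait_type map key shown is an assumption that varies between SQL Server versions; look up the real value before using the filter.

```sql
-- Look up the map key for NETWORK_IO on your instance first:
-- SELECT map_key FROM sys.dm_xe_map_values
-- WHERE name = N'wait_types' AND map_value = N'NETWORK_IO';
CREATE EVENT SESSION [NetworkIoByApp] ON SERVER
ADD EVENT sqlos.wait_info
    (ACTION (sqlserver.client_app_name)
     WHERE wait_type = 99)    -- illustrative map key; substitute the value you looked up
ADD TARGET package0.histogram
(SET filtering_event_name = N'sqlos.wait_info',
     source = N'sqlserver.client_app_name',
     source_type = 1);        -- 1 = group on an action rather than an event column
```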
Tracking Recompilations
Query execution plan recompilations occur when a plan in the plan cache is discarded and recompiled.
High numbers of plan recompilations might indicate a performance problem, and may cause CPU
pressure. Windows performance counters can be used to track overall recompilation counts for a SQL
Server instance, but more detail might be needed to investigate further.
In Extended Events, the sqlserver.sql_statement_recompile event can provide detailed information,
including the cause of recompilation.
The histogram target can be used for tracking recompilations. Group on source_database_id to identify
the database with the highest number of recompilations in an instance. Group on statement/object_id
to find the most commonly recompiled statements.
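The first of these groupings can be sketched as follows; the session name is an assumption. Changing source to N'object_id' would instead surface the most commonly recompiled objects.

```sql
-- Sketch: count statement recompilations per database.
CREATE EVENT SESSION [RecompilesByDb] ON SERVER
ADD EVENT sqlserver.sql_statement_recompile
ADD TARGET package0.histogram
(SET filtering_event_name = N'sqlserver.sql_statement_recompile',
     source = N'source_database_id',
     source_type = 0);   -- 0 = event column
```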
The latch_suspend_end event tracks the end of latch waits by database_id, file_id, and page_id. With
the divides_by_uint64 predicate comparator, you can capture contention specifically on allocation pages,
because the different allocation bitmap pages occur at regular intervals in a database data file. Grouping
the events using the histogram target should make it easier to identify whether latch waits are caused by
contention for allocation bitmap pages.
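A sketch of this technique follows. The session name is invented, and the filter values are assumptions for illustration: PFS pages recur every 8,088 pages in a data file, while GAM and SGAM pages use a 511,232-page interval, so adjust the divisor for the allocation page type you are investigating.

```sql
-- Sketch: isolate latch waits on PFS allocation pages in tempdb.
CREATE EVENT SESSION [AllocPageLatches] ON SERVER
ADD EVENT sqlserver.latch_suspend_end
    (WHERE package0.divides_by_uint64(page_id, 8088)
       AND database_id = 2)   -- tempdb, where allocation contention is most common
ADD TARGET package0.histogram
(SET filtering_event_name = N'sqlserver.latch_suspend_end',
     source = N'page_id',
     source_type = 0);
```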
Mid-page splits.
Mid-page splits create fragmentation and more transaction log records due to data movement.
Tracking page splits alone, using the sqlserver.page_split event, is inefficient as it does not differentiate
between the problematic mid-page splits and normal allocation splits. The sqlserver.transaction_log
event can be used for tracking LOP_DELETE_SPLIT operation to identify the problematic page splits. A
histogram target might be most suitable for this task, grouping either on database_id (to find the
database with the most page splits) or, within a single database, on alloc_unit_id (to find the indexes with
the most page splits).
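The database-level grouping described above can be sketched as follows. The session name is invented, and the operation map key shown is an assumption; confirm the key for LOP_DELETE_SPLIT in sys.dm_xe_map_values on your version before relying on it.

```sql
-- Look up the key first:
-- SELECT map_key FROM sys.dm_xe_map_values
-- WHERE name = N'log_op' AND map_value = N'LOP_DELETE_SPLIT';
CREATE EVENT SESSION [MidPageSplits] ON SERVER
ADD EVENT sqlserver.transaction_log
    (WHERE operation = 11)    -- illustrative key for LOP_DELETE_SPLIT
ADD TARGET package0.histogram
(SET filtering_event_name = N'sqlserver.transaction_log',
     source = N'database_id',
     source_type = 0);
```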
The sqlos.wait_info event can be used to track waits across multiple concurrent sessions.
The sqlserver.lock_acquired event can help with tracking usage in most cases. For database usage, use a
histogram target grouping on database_id. Object usage can be tracked by capturing SCH_M or SCH_S
locks at the object resource level, grouping on object_id in a histogram target.
You can test for this effect by comparing the number of events returned from a ring buffer in XML with
the count of events returned in the XML document header—or check the value of the truncated attribute
in the XML header. In this example, the query is comparing these values for the system_health session:
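The original handbook query is not reproduced in this extract; the following sketch performs the comparison it describes, assuming the system_health session is running and that its ring buffer XML carries the standard header attributes.

```sql
-- Sketch: compare the events actually present in the returned XML with the
-- header's eventCount, and check the truncated attribute.
SELECT s.name,
       x.value('(RingBufferTarget/@eventCount)[1]', 'int') AS events_in_header,
       x.value('(RingBufferTarget/@truncated)[1]', 'int')  AS truncated,
       x.value('count(RingBufferTarget/event)', 'int')     AS events_returned
FROM sys.dm_xe_sessions AS s
JOIN sys.dm_xe_session_targets AS t
    ON s.address = t.event_session_address
CROSS APPLY (SELECT CAST(t.target_data AS xml)) AS c(x)
WHERE s.name = N'system_health'
  AND t.target_name = N'ring_buffer';
```

If events_returned is lower than events_in_header, or truncated is nonzero, the formatted XML exceeded the 4 MB limit and events were dropped from the output.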
Reduce the size of the MAX_MEMORY setting for the ring buffer to reduce the likelihood that the
formatted data will exceed 4 MB. No single value is guaranteed to work; you might have to try a
setting, and be prepared to adjust it, to minimize the truncation effect while still collecting a useful
volume of data in the ring buffer.
Note: This effect is not strictly limited to the ring buffer target; it can occur on any target
that stores output in memory buffers (ring buffer target, histogram target, event pairing target,
and event counter target), but it is most likely to affect the ring buffer target because it stores
unaggregated raw data. All the other targets using memory buffers contain aggregated data, and
are therefore less likely to exceed 4 MB when formatted.
query_post_execution_showplan. Returns the actual query execution plan when a query is executed.
When using any of these events, you should be aware that adding them to a session, even when predicates
are used to limit the events captured, can have a significant impact on the performance of the database
engine instance. This effect is most marked with the query_post_execution_showplan event. You should
limit your use of these events to troubleshooting specific issues; they should not be included in an
Extended Events session that is always running.
Demonstration Steps
1. In SSMS, in Solution Explorer, double-click Demo 2 - track waits by session.sql.
2. In Object Explorer, expand Management, expand Extended Events, right-click Sessions, and then
click New Session.
3. In the New Session dialog box, on the General page, in the Session name box, type Waits by
Session.
4. On the Events page, in the Event library box, type wait, and then, in the list below, double-click
wait_info, to add it to the Selected events list.
6. In the Event configuration options list, on the Global Fields (Actions) tab, select session_id.
8. In the Field list, click sqlserver.session_id, in the Operator list, click >, and then in the Value box,
type 50. This filter will exclude most system sessions from the session.
10. In the Type list, click event_file, in the File name on server box, type
D:\Demofiles\Mod12\waitbysession, in the first Maximum file size box, type 5, in the second
Maximum file size box, click MB, and then click OK.
11. In Object Explorer, expand Sessions, right-click Waits by Session, and then click Start Session.
12. In File Explorer, in the D:\Demofiles\Mod12 folder, right-click start_load_1.ps1, and then click Run
with PowerShell. If a message is displayed asking you to confirm a change in execution policy, type
Y, and then press Enter. Leave the workload to run for a minute or so before proceeding.
13. In SSMS, in the Demo 2 - track waits by session.sql pane, select the code under the comment that
begins Step 14, click Execute, and then review the results.
14. Select the code under the comment that begins Step 15, and then click Execute to stop and drop the
session, and to stop the workload.
15. In File Explorer, in the D:\Demofiles\Mod12 folder, note that one (or more) files with a name
matching waitbysession*.xel have been created.
16. Close File Explorer, close SSMS without saving changes, and then in the Windows PowerShell window,
press Enter to close the window.
Categorize Activity
Place each Extended Events target type into the appropriate category. Indicate your answer by writing the
category number to the right of each item.
Items:
3. Histogram target
Categories:
Category 1: Written to Memory Buffers
Category 2: Written to File on Disk
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
2. Wait for the workload to complete. This should take about 60 seconds.
2. Under the comment that begins Task 2, edit and execute the query to return data from the
system_health session, using the sys.fn_xe_file_target_read_file DMF to extract data from the
session’s event file target.
Hint: you can examine the definition of the system_health session to find the file name used by the
event file target.
2. Click any of the row values in the deadlock_data column to view the deadlock XML in detail.
Results: After completing this exercise, you will have extracted deadlock data from the SQL Server.
Note: Although a page_split event is available, it doesn’t provide enough information for
you to discriminate between expected page splits (which occur when a table or index is
extended, referred to as end page splits) and page splits that can harm performance (which occur
when data must be inserted in the middle of a page, referred to as mid-page splits). You can
detect mid-page splits by analyzing the transaction_log event.
2. Run a Workload
2. Create a new Extended Events session on the MIA-SQL instance with the following properties:
o Event filter(s):
operation = LOP_DELETE_SPLIT
database_name = AdventureWorks
o Session target: Histogram
Filtering target: sqlserver.transaction_log
Source: alloc_unit_id
Source type: event
2. Wait for the workload to complete. This should take about 60 seconds.
Results: After completing this exercise, you will have extracted page split data from SQL Server.
Question: If an Extended Events session has no targets defined, how would you view the
data generated by the session?
You have learned how to create, amend, and drop Extended Events sessions using either SSMS or
Transact-SQL, and about the various methods for extracting data from session targets.
Review Question(s)
Module 13
Monitoring SQL Server
Contents:
Module Overview 13-1
Lesson 1: Monitoring Activity 13-2
Module Overview
The Microsoft® SQL Server® Database Engine can run for long periods without the need for
administrative attention. However, if you regularly monitor the activity that occurs on the database server,
you can address potential issues before they become serious problems.
SQL Server provides a number of tools that you can use to monitor current activity and record details of
previous activity. You need to become familiar with what each of the tools does and how to use them.
It is easy to become overwhelmed by the volume of output that monitoring tools can provide, so you also
need to learn techniques for analyzing their output.
Objectives
After completing this module, you will be able to:
Lesson 1
Monitoring Activity
Dynamic management objects (DMOs) provide insights directly into the inner operations of the SQL
Server Database Engine and are useful for monitoring. SQL Server database administrators must become
familiar with some of the more useful DMOs as part of a process of ongoing server monitoring.
SQL Server Management Studio provides Activity Monitor, which you can use to investigate both current
issues such as: "Is one process being blocked by another process?" and recent historical issues such as:
"Which query has taken the most resources since the server was last restarted?" You should become
familiar with the capabilities of Activity Monitor.
The SQL Server processes also expose a set of performance-related objects and counters to Windows®
Performance Monitor. These objects and counters enable you to monitor SQL Server as a part of
monitoring the entire server.
Lesson Objectives
After completing this lesson, you will be able to:
Explain DMOs.
View activity by using DMOs.
There are more than 150 DMOs covering many categories; DMOs return server-state information that you
can use to monitor the health of a server instance, diagnose problems, and tune performance. There are
two types of DMVs and DMFs:
Server-scoped
Database-scoped
Note: Most database-scoped DMOs are available in Azure® SQL Database. Server-scoped
DMOs are typically not available in Azure SQL Database, because it is a database-level service.
The SQL Server Technical Documentation indicates the availability of a DMO on different SQL
Server service types.
All DMVs and DMFs exist in the sys schema and follow the naming convention dm_%. They are defined in
the hidden resource database and are mapped to other databases. The DMOs are organized into
categories by a naming convention, as shown in the following table:
Category Description
sys.dm_exec_% These objects provide information about connections, sessions, requests, and
query execution. For example, the sys.dm_exec_sessions DMV returns one row
for every session that is currently connected to the server.
sys.dm_os_% These objects provide access to information that is related to the SQL Server
operating system. For example, the sys.dm_os_performance_counters DMV
provides access to SQL Server performance counters without the need to access
them by using operating system tools.
sys.dm_tran_% These objects provide access to transaction management. For example, the
sys.dm_tran_active_transactions DMV returns details of currently active
transactions.
sys.dm_io_% These objects provide information about I/O processes. For example, the
sys.dm_io_virtual_file_stats DMF returns details of I/O performance and
statistics for each database file.
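A minimal example from the sys.dm_exec_% category listed above; the columns chosen are illustrative, and is_user_process filters out system sessions.

```sql
-- List the sessions currently connected by users.
SELECT session_id, login_name, [status], cpu_time, memory_usage
FROM sys.dm_exec_sessions
WHERE is_user_process = 1;
```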
Required Permissions
To query a DMO, you must have the SELECT permission on the object and the VIEW SERVER STATE or
VIEW DATABASE STATE permission, depending on whether the object is server-scoped or database-
scoped. This enables you to selectively restrict access to DMVs and DMFs for a user or logon. To control
access for a user, first create the user in the master database (with any user name) and then deny the user
the SELECT permission on the DMVs or DMFs that you do not want him or her to access. After this, the
user cannot select from these DMVs and DMFs, regardless of the database context of the user, because
the DENY permission within that database context is processed first.
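The pattern described above can be sketched as follows. The login name is illustrative, and the sketch assumes the login already exists and should retain VIEW SERVER STATE for other DMOs.

```sql
-- Sketch: allow general DMO access but block one specific DMV.
USE master;
CREATE USER [MonitoringUser] FOR LOGIN [MonitoringUser];  -- illustrative name
GRANT VIEW SERVER STATE TO [MonitoringUser];
DENY SELECT ON sys.dm_os_wait_stats TO [MonitoringUser];
```

Because the DENY in master is processed first, the login cannot query sys.dm_os_wait_stats from any database context.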
For more information about DMVs, see the topic Dynamic Management Views and Functions (Transact-
SQL) in the SQL Server Technical Documentation:
Historical Information
The second type of DMO returns historical information. This information typically takes the form of
counters or metrics that are aggregated while the database engine instance is running. For example, you
saw that the sys.dm_os_waiting_tasks view returned details of tasks that are currently waiting on
resources. By comparison, the sys.dm_os_wait_stats view returns information about how often and how
long any task had to wait for a specific wait_type since the SQL Server instance started.
Another useful example of a historical function is the sys.dm_io_virtual_file_stats DMF, which returns
information about the performance of database files.
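A minimal example of querying this historical DMF; passing NULL for both parameters returns every file of every database, and the ordering shown is one common way to surface the slowest files.

```sql
-- Cumulative I/O counts and stalls per database file since instance start.
SELECT DB_NAME(vfs.database_id) AS database_name,
       vfs.file_id,
       vfs.num_of_reads, vfs.num_of_writes,
       vfs.io_stall_read_ms, vfs.io_stall_write_ms
FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
ORDER BY vfs.io_stall_read_ms + vfs.io_stall_write_ms DESC;
```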
For more information about the sys.dm_exec_sessions DMV, see the topic sys.dm_exec_sessions
(Transact-SQL) in the SQL Server Technical Documentation:
sys.dm_exec_sessions (Transact-SQL)
https://aka.ms/Rm8m7a
For more information about the sys.dm_os_waiting_tasks DMV, see the topic sys.dm_os_waiting_tasks
(Transact-SQL) in the SQL Server Technical Documentation:
sys.dm_os_waiting_tasks (Transact-SQL)
https://aka.ms/Iicg1s
For more information about the sys.dm_os_wait_stats DMV, see the topic sys.dm_os_wait_stats (Transact-
SQL) in the SQL Server Technical Documentation:
sys.dm_os_wait_stats (Transact-SQL)
https://aka.ms/Fzkyf1
For more information about the sys.dm_io_virtual_file_stats DMF, see the topic
sys.dm_io_virtual_file_stats (Transact-SQL) in the SQL Server Technical Documentation:
sys.dm_io_virtual_file_stats (Transact-SQL)
https://aka.ms/Upoe0q
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. In the D:\Demofiles\Mod13 folder, right-click Setup.cmd, and then click Run as administrator.
3. In the User Account Control dialog box, click Yes, and then wait for the script to complete.
4. In the D:\Demofiles\Mod13 folder, double-click Workload1.cmd.
5. Start SQL Server Management Studio, and then connect to the MIA-SQL database engine instance
by using Windows authentication.
6. On the File menu, point to Open, and then click Project/Solution.
7. In the Open Project dialog box, navigate to the D:\Demofiles\Mod13\Demo folder, and then
double-click Demo.ssmssln.
9. Execute the code under the heading that begins with Task 2 to view currently executing requests.
Approximately 50 rows should be returned, but most are system requests.
10. Execute the code under the heading that begins with Task 3 to view user processes.
11. Execute the code under the heading that begins with Task 4 to filter executing requests by user
sessions to show only user activity.
12. Execute the code under the heading that begins with Task 5 to retrieve details of the Transact-SQL
batch that is associated with each request.
13. Execute the code under the heading that begins with Task 6 to show details of the Transact-SQL
statement that is currently executing within each batch. This statement is complex, but it is
fundamentally a substring operation on the results of the previous step.
14. Execute the code under the heading that begins with Task 7 to stop the workload script. (The
workload is configured to stop when the ##stopload global temporary table is created.)
15. Execute the code under the heading that begins with Task 8 to examine the contents of the query
plan cache.
16. Execute the code under the heading that begins with Task 9 to identify the top 10 most expensive
queries in the query plan cache, based on average logical reads.
17. Execute the code under the heading that begins with Task 10 to view I/O statistics for database files.
18. Execute the code under the heading that begins with Task 11 to view wait statistics. The purpose of
this query is to demonstrate the range of wait types that wait statistics are collected for.
Processes. This section includes detailed information about processes, their IDs, logons, databases,
and commands. This section will also show details of processes that are blocking other processes.
Resource Waits. This section shows categories of processes that are waiting for resources and
information about the wait times.
Data File I/O. This section shows information about the physical database files in use and their recent
performance.
Recent Expensive Queries. This section shows detailed information about the most expensive recent
queries and the resources that those queries consumed. You can right-click the queries in this section
to view either the query or an execution plan for the query. When any pane is expanded, Activity
Monitor queries the instance for information. When a pane is collapsed, all querying activity stops for
that pane. You can expand more than one pane at the same time to view different kinds of activity on
the instance.
Note: The values that Activity Monitor exposes are also accessible by using DMOs. The
tooltips for many of the column headers in Activity Monitor sections describe which DMO
contains the corresponding information.
For more information about Activity Monitor, see the topic Activity Monitor in the SQL Server Technical
Documentation:
Activity Monitor
https://aka.ms/Thoa2y
Demonstration Steps
1. In SQL Server Management Studio, in Solution Explorer, double-click Demo 2a - blocker.sql.
2. Review the contents of the file and notice that it starts a transaction without committing it, and then
click Execute.
3. In Solution Explorer, double-click Demo 2b - blocked.sql, and then click Execute. Notice that a
result set is not returned and the query remains running; this query is blocked from accessing the
HR.Employees table by the transaction that you opened in the previous step.
5. In the MIA-SQL Activity Monitor pane, click Processes to expand the Processes section.
6. In the Processes section, in the Database column, in the column header, click the Filter button, and
then click InternetSales.
7. Notice that one of the processes has a Task State of SUSPENDED; this is the query that is running in
the Demo 2b - blocked.sql query pane.
8. Point to the column header for the Blocked By column to demonstrate that the column tooltip
describes the DMO column that contains the information—the
sys.dm_os_waiting_tasks.blocking_session_id.
9. In the SUSPENDED row, note the value in the Blocked By column. This is the session ID of the
blocking session—the query in the Demo 2a - blocker.sql query pane.
10. In the Processes section, in the Session ID column, in the column header, click the Filter button, and
then click the value of the session that you identified as the blocker in the previous step. Only one
row should now be visible in the Processes section. Notice that the value in the Head Blocker
column is 1. This indicates that this session is the first in a blocking chain.
11. In the Processes section, right-click the row, and then click Kill Process.
12. In the Kill Process dialog box, click Yes to roll back the query in the Demo 2a - blocker.sql query
pane.
14. In the Demo 2b - blocked.sql query pane, notice that the query has completed because the block
has been removed.
16. In the Microsoft SQL Server Management Studio dialog box, click No.
17. Leave SQL Server Management Studio open for the next demonstration.
The main focus of the Windows Performance Monitor is on monitoring CPU, memory, disk system, and
network. After you install SQL Server, a number of objects and counters that are related to SQL Server are
available within Windows Performance Monitor.
For more information about Performance Monitor, see the topic Windows Performance Monitor on
Technet:
Windows Performance Monitor
http://aka.ms/Faox3k
SQL Server provides objects and counters that Performance Monitor uses to monitor activity in computers
that are running an instance of SQL Server. In this context, an object is any SQL Server resource, such as a
SQL Server lock or Windows process. Each object contains one or more counters that determine various
aspects of the objects to monitor. For example, the SQLServer:Locks object contains counters called
Number of Deadlocks/sec and Lock Timeouts/sec.
The SQLServer:Databases object type has one instance for each database on SQL Server. Some object
types (for example, the SQLServer:Memory Manager object) have only one instance. If an object type
has multiple instances, you can add counters to track statistics for each instance, or in many cases, all
instances at once. Counters for the default instance appear in the format SQLServer:<object name>.
Counters for named instances appear in the format MSSQL$<instance name>:<object name> or
SQLAgent$<instance name>:<object name>.
You can specify the SQL Server objects and counters that are monitored when Performance Monitor is
started.
You can also configure Performance Monitor to display statistics from any SQL Server counter. In addition,
you can set a threshold value for any SQL Server counter, and then generate an alert when a counter
exceeds a predefined threshold.
Note: SQL Server statistics are displayed only when an instance of SQL Server is installed. If
you stop and restart an instance of SQL Server, the display of statistics is interrupted and resumes
automatically. Also note that you will see SQL Server counters in the Performance Monitor snap-
in, even if SQL Server is not running. On a clustered instance, performance counters only function
on the node where SQL Server is running.
SQL Server counters for Performance Monitor can be queried from Transact-SQL statements by using the
sys.dm_os_performance_counters system DMV.
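A minimal example of this; the counter chosen is illustrative, and the object_name prefix follows the instance-naming formats described earlier.

```sql
-- Read buffer cache hit ratio counters from Transact-SQL.
SELECT [object_name], counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name LIKE N'Buffer cache hit ratio%';
```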
For more information about SQL Server objects and counters in Performance Monitor, see the topic Use
SQL Server Objects in the SQL Server Technical Documentation:
Use SQL Server Objects
https://aka.ms/Lwwntb
For more information about the sys.dm_os_performance_counters system DMV, see the topic
sys.dm_os_performance_counters (Transact-SQL) in the SQL Server Technical Documentation:
sys.dm_os_performance_counters (Transact-SQL)
https://aka.ms/Dtkfv2
Demonstration Steps
1. Click Start, click Administrative Tools, and then double-click Performance Monitor.
2. In the Performance Monitor window, in the left pane, expand Data Collector Sets, expand System,
and then, click System Performance.
4. In the Performance Counter Properties dialog box, observe the different performance counters that
this set collects, and then click Cancel.
5. In the left pane, right-click System Performance, and then click Start. Note that the symbol for the
System Performance collector set changes to reflect that it is running. The collector set will collect
data for one minute before it stops automatically. Wait for the collector set to finish, and the symbol
to change back to its original form.
6. On the Action menu, click Latest Report. Notice that new nodes are added to the tree in the left
pane. In the right pane, expand each section of the report to demonstrate the information that has
been collected.
7. When you have finished reviewing the report, in the left pane, click Performance Monitor. Notice
that the right pane changes to show a chart, with the % Processor Time counter preselected.
8. In the right pane, right-click anywhere in the pane, and then click Add Counters.
9. In the Add Counters dialog box, in the Available counters list, click the following counters, and then
click Add >> to add them:
11. Review the changes to the Performance Monitor chart and then close Performance Monitor.
12. In SQL Server Management Studio, in Solution Explorer, double-click Demo 3 - counters.sql.
13. Execute the code in the file to demonstrate that SQL Server Performance Monitor counters are
accessible by using the sys.dm_os_performance_counters system DMV.
15. Leave SQL Server Management Studio open for the next demonstration.
Question: Why might you use the sys.dm_os_performance_counters system DMV, instead
of Performance Monitor, to access SQL Server counters?
Lesson 2
Capturing and Managing Performance Data
You have seen that DMOs provide useful information about the state of the system. The values that the
DMOs provide are not generally persisted in any way and only reside in memory while the server is
running. When the server instance is restarted, these values are reset.
When DMVs and DMFs were first introduced in SQL Server 2005, it was common for users to want to
persist the values that the DMVs and DMFs provided. To this end, many users created a database to hold
the values, and then created a job that would periodically collect and save the values.
The data collector system that was introduced with SQL Server 2008 formalizes this concept by creating a
central warehouse for holding performance data, jobs for collecting and uploading the data to the
warehouse, and a set of high quality reports that can be used to analyze the data. This lesson describes
how to set up and configure the data collector. The next lesson describes the reports that are available
from the data that the data collector has gathered.
Lesson Objectives
After completing this lesson, you will be able to:
Explain the role of the data collector.
You can collect information from several locations. The data collector can:
Query DMOs to retrieve detailed information about the operation of the system.
Retrieve performance counters that provide metrics about the performance of both SQL Server and
the entire server.
Capture SQL Trace events that have occurred.
In addition to the system data collection sets, the SQL Server data collector can be extended by the
creation of user-defined data collection sets. The ability to add user-defined data collection sets enables
users to specify the data that they want to collect, and to use the infrastructure that the SQL Server data
collector provides to collect and centralize the data.
To function, the SQL Server data collector depends upon a combination of SQL Server Agent jobs and SQL
Server Integration Services (SSIS) packages. You will find the SQL Server data collector easier to work with
if you are already familiar with these technologies.
For more information about the SQL Server data collector, see the topic Data Collection in the SQL Server
Technical Documentation:
Data Collection
https://aka.ms/Wy1dug
A large enterprise should consider the use of a stand-alone system for the management data warehouse.
There are two goals for creating a centralized management data warehouse:
You can access reports that combine information for all server instances in your enterprise.
You can offload the need to hold collected data, and to report on it from the production servers.
The data collector has two methods for uploading captured performance data into the central warehouse.
Low volume information is sent immediately to the warehouse. Higher volume information is cached
locally first, and then uploaded to the warehouse by using SSIS.
For more information about the management data warehouse, including a detailed description of the
database schema, see the topic Management Data Warehouse in the SQL Server Technical
Documentation:
System data collection sets are created automatically during the setup of SQL Server data collection. You
can enable and disable them as needed, and both the frequency of collection and the retention periods
for collected data can be customized for system data collection sets and for user-defined data collection
sets.
You must also consider the security requirements for the data collector. Security requirements are
discussed in the next topic.
Note: If the SQL Server Agent has been configured to run as a system service account, you
may need to configure a proxy to upload the data into the management data warehouse.
For more information about how to configure SQL Server data collection, see the topic Configure the
Management Data Warehouse (SQL Server Management Studio) in the SQL Server Technical
Documentation:
mdw_writer. Members of this role can write and upload data to the management data warehouse.
All of the data collectors that write to an instance of the management data warehouse must be
members of this role.
Note: In addition to needing to be a member of the mdw_writer role to upload data, the
jobs that collect the data need whatever permissions are required to access the data that they are
collecting.
mdw_reader. Members of this role can read data from the management data warehouse. A user
needs to be a member of the mdw_reader role to be able to access the rich reports or directly read
the collected data.
dc_admin. Members of this role have full control of the data collection configuration, including:
Note: Members of the dc_admin role must configure SQL Server Agent jobs, so the
dc_admin role is a member of the SQLAgentUserRole fixed database role in the msdb system database.
For more information about the SQL Server Agent fixed roles, see the topic SQL Server Agent Fixed
Database Roles in the SQL Server Technical Documentation:
dc_operator. Members of this role can read and update data collection configuration, including:
o Changing the collection frequency for collection items that are part of a collection set.
dc_proxy. Members of this role have read access to data collection configuration. They can also
create SQL Server Agent job steps that use an existing proxy account, and execute SQL Server Agent
jobs that they own.
Note: Members of the dc_operator and dc_proxy roles must interact with SSIS packages,
so both roles are members of the db_ssisltduser and db_ssisoperator fixed roles in the msdb
system database.
For more information about the SSIS fixed roles, see the topic Integration Services Roles (SSIS Service) in
the SQL Server Technical Documentation:
Integration Services Roles (SSIS Service)
https://aka.ms/W82x28
For more information about data collection security, see the topic Data Collector Security in the SQL
Server Technical Documentation:
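Membership of the management data warehouse roles described above is granted with the standard role management commands. A minimal sketch, assuming a warehouse database named MDW and illustrative login names:

```sql
USE MDW;

-- Allow a DBA to read collected data and run the standard reports.
CREATE USER [ADVENTUREWORKS\DBAUser] FOR LOGIN [ADVENTUREWORKS\DBAUser];
ALTER ROLE mdw_reader ADD MEMBER [ADVENTUREWORKS\DBAUser];

-- Allow a collector (for example, the SQL Server Agent service account of a
-- monitored instance) to upload data.
CREATE USER [ADVENTUREWORKS\SQLAgentSvc] FOR LOGIN [ADVENTUREWORKS\SQLAgentSvc];
ALTER ROLE mdw_writer ADD MEMBER [ADVENTUREWORKS\SQLAgentSvc];
```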
The data that is logged into the msdb system database is kept with the same retention period settings as
the data collection sets that it relates to. The information that is retained can be viewed through the SQL
Server Management Studio log file viewer or by querying the following objects in the msdb system
database:
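The data collector's run history can also be inspected with a direct query. A sketch that assumes the standard data collector views, such as syscollector_execution_log, are present in msdb:

```sql
USE msdb;
-- Most recent data collector runs, newest first.
SELECT TOP (20) *
FROM dbo.syscollector_execution_log
ORDER BY start_time DESC;
```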
Three levels of logging are available and can be set by calling the sp_syscollector_update_collection_set
system stored procedure. The lowest level of logging records starts and stops of collector activity. The
next level of logging adds execution statistics and progress reports. The highest level of logging adds
detailed SSIS package logging.
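For example, the following call raises a collection set to the most detailed logging level; the @logging_level parameter is part of the procedure, and the collection set ID shown is illustrative:

```sql
USE msdb;
-- @logging_level: 0 = starts and stops only,
--                 1 = adds execution statistics and progress reports,
--                 2 = adds detailed SSIS package logging.
EXEC dbo.sp_syscollector_update_collection_set
    @collection_set_id = 1,
    @logging_level = 2;
```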
For more information about working with the data collector, see the topic Manage Data Collection in the
SQL Server Technical Documentation:
Demonstration Steps
Configure the Management Data Warehouse
1. In Object Explorer, under MIA-SQL, expand Management, right-click Data Collection, point to
Tasks, and then click Configure Management Data Warehouse.
4. In the New Database dialog box, in the Database name box, type MDW, and then click OK.
6. On the Map Logins and Users page, review the available options, and then click Next.
8. Wait for the configuration process to complete, and then click Close.
Enroll the MIA-SQL Instance for Data Collection
1. In Object Explorer, under Management, right-click Data Collection, point to Tasks, and then click
Configure Data Collection.
2. In the Configure Data Collection Wizard window, click Next.
3. On the Setup Data Collection Sets page, next to the Server name box, click the Ellipsis (…) button.
4. In the Connect to Server dialog box, verify that the Server name box has the value MIA-SQL, and
then click Connect.
5. On the Setup Data Collection Sets page, in the Database name list, click MDW.
6. Under Select data collector sets you want to enable, select the System Data Collection Sets
check box, and then click Next.
8. Wait for the configuration process to complete, and then click Close.
1. In Object Explorer, under Management, expand Data Collection, and then expand System Data
Collection Sets. Observe that four collection sets are available but one of them is stopped.
3. In the Data Collection Set Properties dialog box, on the General page, click Pick.
4. In the Pick Schedule for Job dialog box, click CollectorSchedule_Every_5min (the row with ID = 2),
and then click OK.
7. Click Execute.
8. In File Explorer, in the D:\Demofiles\Mod13 folder, start Workload1.cmd and allow it to run. This
script will generate some activity that will appear in the data collection reports in the demonstrations
in the next lesson.
Lesson 3
Analyzing Collected Performance Data
After performance data has been collected from a number of server instances and consolidated into a
management data warehouse, the data can then be analyzed. You can write your own custom reports by
using SQL Server Reporting Services or the custom reports feature of SQL Server Management Studio, or
analyze the data directly by using Transact-SQL queries. Most users will find the standard reports that are supplied with SQL Server
to be sufficient, without the need to write additional reports. You must be familiar with the information
that is contained in the standard reports, and you must know how to navigate within the reports.
Lesson Objectives
After completing this lesson, you will be able to:
You can use these reports to obtain information for monitoring system capacity and troubleshooting
system performance.
Note: Each report can be printed or exported to PDF or Microsoft Excel® files for further
analysis.
For more information about data collection reports, see the topic System Data Collection Set Reports in
the SQL Server Technical Documentation:
System Data Collection Set Reports
https://aka.ms/Euotdi
For information about how to access a data collection set report, see the topic View a Collection Set
Report (SQL Server Management Studio) in the SQL Server Technical Documentation:
This report also includes a number of hyperlinks that lead to a set of linked subreports that provide a
more detailed view of the usage data:
Disk usage for database. This subreport is displayed when you click a database name in the Disk
Usage summary report. It shows pie charts of data file usage for a single database.
Disk usage collection set - database. This subreport is displayed when you click a trend link in the
Disk Usage summary report. It shows the historical data that the Disk Usage data collection set has
gathered for a single database, together with a chart that maps the percentage of free space in the
database data files.
Demonstration Steps
Force a Data Collection
1. In SQL Server Management Studio, in Object Explorer, under Management, under Data Collection,
under System Data Collection Sets, right-click Disk Usage, and then click Collect and Upload
Now.
2. In the Collect and upload Data Collection Set dialog box, click Close.
3. In Object Explorer, right-click Query Statistics, and then click Collect and Upload Now.
4. In the Collect and upload Data Collection Set dialog box, click Close.
5. In Object Explorer, right-click Server Activity, and then click Collect and Upload Now.
6. In the Collect and upload Data Collection Set dialog box, click Close.
1. In Object Explorer, under MIA-SQL, expand Databases, right-click MDW, point to Reports, point to
Management Data Warehouse, and then click Management Data Warehouse Overview.
2. Review the hyperlinks under each report name that show the last data collection date and time.
Under Disk Usage, click the date and time hyperlink.
3. In the Disk Usage Collection Set report, observe the data that is available in the report, and then
click InternetSales.
4. In the Disk usage for database: InternetSales report, observe the available information, and in the
report pane, in the upper-left corner, click the Navigate Backward button to return to the Disk
Usage Collection Set report.
5. In the Log Trend column, click the trend line for the MDW database. Note that you must click the
line itself; clicking elsewhere in the cell that contains the line will not work.
6. In the Disk Usage Collection Set - Log: [MDW] report, observe the information that is available in
the report. Note that the chart in this report is based on percentage of free space in the data file.
7. Click the Navigate Backward button to return to the Disk Usage Collection Set report.
8. In the report pane, click the Navigate Backward button to return to the Management Data
Warehouse Overview: MDW report.
9. Leave SQL Server Management Studio open for the next demonstration.
This report has a large number of linked subreports that provide much deeper information than appears
on the initial summary. The initial report is a dashboard that provides an overview. If you investigate this
report, you will find that almost every item that is displayed is a link to a more detailed subreport. For
example, you can click a trend line in a graph to find out the values that make up the trend.
By default, the main report shows data for the whole data collection period. You can restrict your view to
a time range or an individual data collection instance by using the timeline control at the top of the
report or alternatively by using the Calendar button.
The report has six charts, each of which reports on a different aspect of server activity. You can click the
data lines on four of the charts to drill through to a more detailed report:
%CPU
Memory Usage
For the remaining two charts, click the chart title to drill through to a more detailed view:
Demonstration Steps
1. In SQL Server Management Studio, in the Management Data Warehouse Overview pane, in the
report pane, in the upper-left corner, click the Refresh button.
2. In the main report, under Server Activity, click the date and time hyperlink.
3. In the Server Activity History report, observe the timeline and the six charts. Explain that some of
the charts are empty because no relevant activity has taken place.
4. Demonstrate the use of the timeline; under the timeline, click the Zoom In button twice. Note how
the range of dark blue squares, each representing a data collection point, reduces each time that you
click, and the range of data in the charts changes to match the selected time range.
5. In the Memory Usage chart, click the lower line to drill through.
6. In the SQL Server Memory Usage report, scroll down to view the SQL Server Internal Memory
Consumption By Type chart.
7. Expand the Average Memory Use by Component section of the report to view detailed memory
usage information for each SQL Server component.
8. In the report pane, in the upper-left corner, click the Navigate Backward button to return to the
Server Activity History report.
9. In the report pane, in the top left, click the Navigate Backward button five times to return to the
Management Data Warehouse Overview report.
10. Leave SQL Server Management Studio open for the next demonstration.
Duration
CPU
Logical writes
Physical reads
Execution count
You control the time range of the report by using a timeline in a similar fashion to the Server Activity
report. For a given time range, you can view the top 10 most expensive queries by each of the cost
metrics that are used to collect the data (such as duration and CPU).
This report also includes linked subreports that you can use to access higher levels of detail. As an
example, you can retrieve query plans from the expensive queries that were in memory at the time that
the capture was performed.
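The cost metrics the report exposes correspond to columns of the sys.dm_exec_query_stats dynamic management view, which the Query Statistics collection set samples. A live sketch against a single instance:

```sql
-- Top 10 cached statements by CPU; the same metrics (duration, CPU, logical
-- writes, physical reads, execution count) drive the Query Statistics report.
SELECT TOP (10)
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
             WHEN -1 THEN DATALENGTH(st.text)
             ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text,
    qs.total_elapsed_time AS total_duration_us,
    qs.total_worker_time  AS total_cpu_us,
    qs.total_logical_writes,
    qs.total_physical_reads,
    qs.execution_count
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;
```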
Note: The data that the Query Statistics report returns has much in common with the
Query Store—a new feature in SQL Server. The key difference is that Query Store is configured at
database level, but the Query Statistics report covers one or more database engine instances in
one report.
Demonstration Steps
1. In SQL Server Management Studio, in the Management Data Warehouse Overview pane, in the
report pane, in the upper-left corner, click the Refresh button.
2. In the main report, under Query Statistics, click the date and time hyperlink.
3. In the Query Statistics History report, observe that the report uses the same timeline control as the
Server Activity report to navigate through the data.
4. Under the Navigate through the historical snapshots of data using the time line below, click the
icon.
5. Under the Top Queries by Total CPU chart, click Duration, click Total I/O, click Physical Reads,
and then click Logical Writes to demonstrate the different views of the data.
8. In the Query Details report, scroll down to view the various components of the report.
9. At the bottom of the report, in the Top Query Plans By Average CPU Per Execution table, in the
Plan # column, click the first row.
10. In the Query Plan Details report, scroll down, and in the Query Execution Statistics section, click
View graphical query execution plan.
11. Note that a new pane opens that contains the query plan.
12. Close SQL Server Management Studio without saving any changes.
You have been tasked with ensuring that all the databases are configured to store performance data and
are able to produce reports to provide proof that there are no health issues. Your manager is keen to be
notified if, during this process, you uncover any issues that need resolving.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
2. Note that the top most expensive query begins with SELECT total.name.
3. Use SQL Server Management Studio to create a new Management Data Warehouse named MDW.
2. Change the frequency of the data capture for Query Statistics to every 5 minutes every day.
3. View the Query Statistics History report, and drill down to view the graphical query execution plan for
the SELECT total.Name query.
Note: In a production environment, you should weigh the frequency of data capture and
the disk space required against the cost of storing that data.
Results: After completing this exercise, you should have configured a Management Data Warehouse
called MDW on the MIA-SQL instance.
You use a combination of the Activity Monitor and Management Data Warehouse to find a problem and
make recommendations to your manager that will hopefully resolve their performance issues.
2. In the User Account Control dialog box, click Yes, and leave the window open, as it is creating a load
on the database.
2. Has SQL Server identified any issues with the query? How would you resolve the issues?
3. Using the Management Data Warehouse, use the Query Statistics History report to view the Query
Plan Details report.
4. Note that the report shows the same issues as the execution plan from Activity Monitor.
Use Activity Monitor to see which running queries are the most expensive
Question: What are the benefits of using a central data warehouse for SQL Server
performance data, instead of local collection on each server?
Use Activity Monitor for easy access to the most relevant information.
Use Performance Monitor to gather metrics for Windows and SQL Server.
Review Question(s)
Question: Which SQL Server activity monitoring tools are best suited to your organization’s
needs?
Module 14
Troubleshooting SQL Server
Contents:
Module Overview 14-1
Lesson 1: A Troubleshooting Methodology for SQL Server 14-2
Module Overview
Database administrators working with Microsoft® SQL Server® need to adopt the important role of
troubleshooter when issues arise—particularly if users of business-critical applications that rely on SQL
Server databases are being prevented from working. It is important to have a solid methodology for
resolving issues in general, and to be familiar with the most common issues that can arise when working
with SQL Server systems.
Objectives
After completing this module, you will be able to:
Lesson 1
A Troubleshooting Methodology for SQL Server
Before starting to try to resolve any issue, it’s important to be prepared to apply a logical troubleshooting
methodology in a consistent manner. Although troubleshooting is often regarded as an art as much as a
science, there are a number of characteristics common to many good troubleshooters. You should aim to
develop or emulate those characteristics to ensure that you can successfully troubleshoot issues in your
organization.
Lesson Objectives
After completing this lesson, you will be able to:
Quickly decides upon a cause for problems—without justification or evidence—and then tries to
justify the selection of the cause.
Investigation Phase
This is a critical phase. Before you can solve any
problem, you need to be very clear in your
understanding of what the problem is. It can be
very tempting to bypass this step and jump
directly to attempting solutions, especially when a
serious problem occurs or you are under pressure.
However, acting without sufficient information is often a waste of effort, and might actually make the
issue worse.
You need to understand what does and doesn’t work before attempting to resolve a problem. Gather as
much information as you can about the circumstances in which the issue occurred. Some questions you
might ask include:
When did the issue occur? Knowing the time and date that an issue occurred can help you find
information from application and operating system log files, or might suggest that the problem has a
link to another event.
In what environment did the error occur? This is a broad category, which will vary a lot depending
on the nature of the problem and the system you are troubleshooting. Environmental characteristics
that might be helpful when troubleshooting include:
What was the user doing when the issue occurred? Knowing how the problem presents itself to
the user can help you understand which part of a system, or which sections of code, might be causing
the problem. One very important concept in this phase is that the issue needs to be defined from the
affected user’s point of view, not from the assumed perspective of an IT professional. For example,
there is no point telling a user that a system is working if they cannot use it for any reason—
regardless of how your IT-based perspective might tell you that the system is working.
Were any error messages displayed? If so, what was the error number and/or error text?
Application error messages can often give a clear indication of the cause of an issue. System errors
from the SQL Server Database Engine often include a clear description of their causes.
Can the problem be reproduced, or was it a one-off? If the problem can be reproduced, what
are the steps to take? Troubleshooting is always easier if you know how to reproduce the issue,
because this makes it much easier to investigate causes and test your fix.
How many users have experienced the problem? If an issue affects a subset of users, the
differences between affected and unaffected groups might suggest the cause of the problem.
When did the affected component last work correctly? Sometimes, when a user complains that
something doesn't work, it might be that it has never worked. Verify that there was a time when the
system worked as expected, and find out when that was.
What changes have been made to the affected system since the component last worked
correctly? Troubleshooting is often the process of identifying the change that causes a working
system to go into a broken state; knowing details of the changes that have occurred can help you
identify the cause.
Note: These questions are mainly phrased in terms of an end user reporting a problem.
However, most of these questions are valid even when there is no end user of a system—for
example, in a problem involving an automated process, such as a SQL Server Agent job.
Finally, you need to know how the user would decide that the issue is resolved. A common mistake when
troubleshooting is to find a problem, assume that it is the cause of the issue, resolve that problem, and
then conclude that the original issue is now settled—without verifying that the original issue is actually
resolved.
Analysis Phase
In the analysis phase, you need to determine all the possible causes of the issue that you are trying to
resolve. At this point, it is important to avoid excluding any potential causes, no matter how unlikely you
consider them to be. The information you gather in the investigation phase might suggest potential
causes.
A discussion with one or more other people is often useful in this phase, particularly if you can obtain
alternative viewpoints. The analysis phase often benefits from two types of people—one with an excellent
technical knowledge of the product and another who constantly requires the first person to justify their
thoughts, while thinking both logically and laterally.
Implementation Phase
In the implementation phase, you need to eliminate each potential cause. This process of elimination
usually returns the best results when the potential causes are ruled out in order, from the most likely to
the least likely. The information you gathered in the investigation phase will help you assess the relative
likelihood of possible causes; the better a possible cause explains the behavior you have observed, the
more likely it is to be the cause of the problem.
The critical aspect of the implementation phase is to make sure that your reasons for eliminating potential
causes are logically valid. If you reach the end of your list of potential causes and have not yet found a
solution to the issue, you should return to the analysis phase and recheck your thinking. If you cannot find
a problem in your analysis, you might need to recheck your initial assumptions in the investigation phase,
or gather more information.
Validation Phase
It is easy, particularly when you are new to troubleshooting, to assume that problems are resolved when
they are not. Do not assume that, because you have found and resolved a problem, this was the original
one that you were aiming to solve.
In the investigation phase, you should have determined how the user would decide if the issue is resolved.
In the validation phase, you must apply that test to see if the issue really is resolved.
Documentation
After the problem is resolved, it is good practice to document your findings. This means that other
members of your organization can understand and learn from the issue, and also provides a useful guide
to resolving the issue should it occur again in the future.
Sequencing Activity
Number each of the following troubleshooting phases to indicate their correct order.
Steps
Investigation Phase
Analysis Phase
Implementation Phase
Validation Phase
Create Documentation
Lesson 2
Resolving Service-Related Issues
In the remainder of this module, you will see how to approach common types of issues that can arise
when working with SQL Server systems.
SQL Server comprises several Windows® services. While troubleshooting these services has much in
common with troubleshooting all Windows services, there are some considerations specific to SQL Server.
This lesson covers the types of issue involving problems with SQL Server services.
Lesson Objectives
After completing this lesson, you will be able to:
For more information about using the DAC, see the topic Diagnostic Connection for Database
Administrators in the SQL Server Technical Documentation:
Diagnostic Connection for Database Administrators
https://aka.ms/Ihvyn9
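By default, the DAC accepts only local connections. If you need to allow remote DAC connections (useful when you cannot log on to the server itself), enable the remote admin connections server configuration option:

```sql
-- Allow the DAC to accept connections from remote clients.
EXEC sp_configure 'remote admin connections', 1;
RECONFIGURE;
```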
If SQL Server will not start but the issue is not caused by a problem with the service account, check:
Whether the SQL Server log files indicate that the master or model database is corrupt. If either is
corrupt, follow the procedures to recover the databases.
Whether the file paths to the tempdb database files are accessible. SQL Server recreates the tempdb
database each time the server instance starts, but the path to the database files (as configured in the
master database) must exist and be accessible to the service account under which the SQL Server
service is running.
Whether you can start the instance by using the command prompt. If starting SQL Server from a
command prompt does work, check the configuration of the service and make sure that the
permission requirements are met.
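The tempdb file paths that the instance will try to use at startup are recorded in the master database, and can be checked from any working connection (including the DAC) with a query such as:

```sql
-- Paths SQL Server will use when it recreates tempdb at startup.
SELECT name, physical_name, state_desc
FROM sys.master_files
WHERE database_id = DB_ID(N'tempdb');
```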
For more information about the steps to rebuild system databases, see the topic Rebuild System
Databases in the SQL Server Technical Documentation:
For more information about starting the SQL Server Database Engine from the command prompt, see the
topic sqlservr Application in the SQL Server Technical Documentation:
sqlservr Application
https://aka.ms/Schfvr
SQL Server keeps a set of archive log files that you can also review, because problems may have been
occurring for some time. By default, backups of the previous six log files are retained. The most recent log
archive has the extension .1, the second most recent the extension .2, and so on. The current error log has
no extension. You can increase the number of log files to retain by customizing the log configuration, but
you cannot choose to retain fewer than six log files.
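The current and archived logs can also be read from a query window. A sketch using the widely used (though undocumented) sp_readerrorlog procedure:

```sql
-- First parameter:  0 = current log, 1 = most recent archive (.1), and so on.
-- Second parameter: 1 = SQL Server error log (2 = SQL Server Agent log).
-- Third parameter:  optional text filter.
EXEC sp_readerrorlog 1, 1, N'error';
```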
For more information about viewing SQL Server log files from a remote instance, see the topic View
Offline Log Files in the SQL Server Technical Documentation:
For instructions about how to access Windows logs, see the topic View the Windows Application Log
(Windows) in the SQL Server Technical Documentation:
View the Windows Application Log (Windows 10)
https://aka.ms/Lu1s3x
To reduce the number of log events shown in the log viewer, apply a filter to include only those events
with an event level of Critical, Error, or Warning. This will restrict your view to only the most significant
events.
Note: When you are reviewing Windows event log data, be conscious that errors or
warnings not generated by SQL Server services can have an impact on SQL Server’s behavior. For
example, a message from a storage controller indicating that a drive in a RAID array has failed
might indicate a problem with SQL Server’s I/O performance; or a message about an incorrectly
configured network adapter might indicate that query results will be delivered more slowly to
client applications.
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa55w.rd.
4. Click the Start button, type Configuration Manager, and then click SQL Server Configuration
Manager.
6. In SQL Server Configuration Manager, in the left pane, click SQL Server Services.
7. In the right pane, note that the SQL Server (SQL2) service is not running.
8. Right-click SQL Server (SQL2), and then click Start. Note that the service does not start successfully,
and an error message is returned.
10. To check the Windows system log, click Start, type Event Viewer, and press Enter.
11. In Event Viewer, in the left pane, expand Windows Logs, and then click System.
12. Click on the most recent message with a Level of Error and a Source of Service Control Manager.
13. On the Details tab, note that the error message states that there is a service specific error, and
provides only the following details:
15. To check the SQL Server error log, start File Explorer, and navigate to C:\Program Files\Microsoft
SQL Server\MSSQL14.SQL2\MSSQL\Log.
17. Right-click ERRORLOG, click Open with, and then click Notepad. Notice the last three lines of the
file include the error number displayed in the Windows system log (17113), and a detailed description
of the problem (scroll to the right to read the full error message). The message indicates that the data
file for the master database cannot be found.
18. In File Explorer, navigate to the folder location mentioned in the error message (C:\Program
Files\Microsoft SQL Server\MSSQL14.SQL2\MSSQL\DATA).
19. If a DATA dialog box appears, click Continue. Notice that the folder contains the file
master.AV0001. The demonstration simulates the situation where the master.mdf file has been
quarantined by an antivirus application, which has renamed it master.AV0001.
(In a real-world scenario, you would need to recover the file from the quarantine system, and prevent
the antivirus application from scanning this folder, before attempting to restart the service.)
20. For the purposes of this demonstration, rename the file master.AV0001 to master.mdf; right-click
master.AV0001 then click Rename.
21. Replace the AV0001 file extension with mdf, then press Enter.
24. In SQL Server Configuration Manager, in the right pane, right-click SQL Server (SQL2), and then click
Start. Note that the service starts successfully.
Lesson 3
Resolving Connectivity and Login Issues
Database administrators who work with SQL Server are commonly called upon to resolve problems with
connectivity and logins. Users might have problems making connections to SQL Server and, in addition to
resolving problems with network names, protocols, and ports, you might also have to investigate issues
with logins and passwords.
Best Practice: Using logins based on Windows authentication removes the need for you, as
a database administrator, to deal with most password and authentication-related issues.
Lesson Objectives
After completing this lesson, you will be able to:
Network-Related Issues
If a login attempt using shared memory succeeds, the problem is likely to be network-related. In rare
cases, you may see a problem with incompatible network libraries on the client and the server, but most
problems are much less subtle. The following tests address common causes of network-related
connectivity issues:

Can the server name be resolved?
If the client application is attempting to connect to a database engine instance by name, it must be
possible for the client to resolve the server’s network name to a routable address. In TCP/IP networks, this
will typically require a DNS lookup. You can use command-line tools such as ping and nslookup to
confirm whether the server name can be correctly resolved to an IP address, and to find out the DNS
server being used to resolve DNS queries. Another indicator of name resolution problems is that you can
connect to an instance by specifying the server IP address, but an attempt to connect using the server
name fails.

Can the network and the server be reached?
While a server name might be successfully resolved to a network address, no connectivity may be
possible between the client and server systems. On TCP/IP networks, you might be able to test the
network route from the client to the server using command-line tools such as ping or tracert. However,
you should be aware that, on some networks, the ICMP protocol used by these tools is disabled, in which
case the tests will be inconclusive. Problems in this area might indicate that changes need to be made to
the configuration of network devices, such as switches and gateways.

Is the client attempting to connect on the correct network interface?
On a server with more than one network interface, SQL Server might not be configured to accept
connections on all of them. You can verify which network interfaces a database engine instance is
configured to use through SQL Server Configuration Manager. If an instance is not listening on all of a
server’s network interfaces, verify that the client is attempting to connect to an interface being used by
SQL Server.

Is the server configured to accept remote connections?
Remember that, by default, a new SQL Server instance has no network protocols enabled. You will not be
able to make connections to a database engine instance over Named Pipes or TCP/IP until you enable the
protocol using SQL Server Configuration Manager.

Is the client configured to use the right protocol and settings?
Check that the network protocol that the client is attempting to use to connect to the database engine
instance is enabled on the server. The client protocol will typically be defined in the client application
connection string. Server settings can be viewed using SQL Server Configuration Manager.

Is the SQL Server Browser service running for named instances that are not using fixed ports?
If you are using named instances of SQL Server, the client systems need to be able to resolve the names
of the instances to a port number. By default, this is achieved by connecting to the SQL Server Browser
service on UDP port 1434. Check whether the SQL Server Browser service is running, and whether
database engine instances are configured to use dynamic TCP/IP ports, through SQL Server Configuration
Manager. Be aware that, even if the SQL Server Browser service is running, individual database engine
instances can be configured not to be advertised by the SQL Server Browser service using the
HideInstance flag in the server network configuration. If you do not want to run the SQL Server Browser
service, assign fixed ports to named instances, and consider configuring client aliases.

Are instance aliases correctly configured?
If an alias is configured for a database engine instance, confirm that it exists for the network library used
by the client application. Remember that aliases are defined separately for 32-bit and 64-bit clients.
Check alias configuration using SQL Server Configuration Manager.

Is a firewall blocking connections?
Check to make sure that there is no firewall blocking the ports that you are trying to connect over; this
might be a network firewall, or the Windows firewall on the client or server. If a firewall is blocking your
access, an exception or rule will likely need to be configured in the firewall.
Best Practice: When you suspect a network connectivity problem in a TCP/IP network, start
by testing a connection from the client to the server using the server IP address—and, if
necessary, the TCP port number. Should a connection by IP address fail, the issue is likely to be
with the network—perhaps a routing or firewall problem. If a connection by IP address is
successful but a connection by name fails, the issue is likely to be with name resolution—either
DNS, the SQL Server Browser service, or SQL Server aliases.
For more information about configuring SQL Server networking, see the topic Server Network
Configuration in the SQL Server Technical Documentation:
Make sure that the database engine instance is configured for SQL Server authentication. When SQL
Server is configured for Windows authentication only, SQL Server logins can still be created and enabled,
but cannot be used to connect to the instance. When a connection is attempted with a SQL Server login
while SQL Server authentication is disabled, the most common error returned indicates that a trusted
connection is not available.

Make sure that the login has not been locked out by an account policy. If a SQL Server login is
configured to use a password expiry policy—but the client application is not configured to handle the
exchange of messages required to change an expired password—the application will stop working when
the login’s password expires. This situation is common with applications that were designed for versions
of SQL Server that do not implement account policy for SQL Server logins.
For both Windows and SQL Server logins, make sure that the login has permission to connect to SQL
Server, in addition to being able to access the database it is attempting to connect to. This check should
include the default database for the login.
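You can check these properties of a login by querying the server catalog views; in the following sketch, the login name SalesApp is hypothetical:

```sql
-- Check whether a login exists, whether it is disabled, and its default database.
-- The login name SalesApp is hypothetical.
SELECT name, type_desc, is_disabled, default_database_name
FROM sys.server_principals
WHERE name = N'SalesApp';

-- Check whether the login holds permission to connect to the instance.
SELECT pr.name, pe.permission_name, pe.state_desc
FROM sys.server_principals AS pr
JOIN sys.server_permissions AS pe
    ON pe.grantee_principal_id = pr.principal_id
WHERE pr.name = N'SalesApp'
  AND pe.permission_name = N'CONNECT SQL';
```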
If a login problem is happening with a large number of different users, check for a failing logon trigger.
When a logon trigger prevents users from connecting, the error message that is returned to interactive
users indicates that a logon trigger prevented the connection.
When a login issue occurs, the error message returned to the client application is often generic, indicating
that a login issue occurred, but not showing the details of the problem. A good example of this is SQL
Server error number 18456, which is raised for a number of different login issues, and returns the
following error message to the client application:
Login failed for user '<user_name>'. (Microsoft SQL Server, Error: 18456)
This is a security measure, designed to limit the leakage of information about server logins to an attacker,
but this can make troubleshooting more difficult. In most instances of a login problem, a message is
written to the SQL Server error log containing more information about the problem. In the case of error
number 18456, the State property of the error message written to the SQL Server error log indicates the
cause of the error. A detailed error message is also written to the log.
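One way to retrieve the detailed entry is to search the current SQL Server error log for the error number, for example with the sp_readerrorlog system procedure (undocumented, but widely used):

```sql
-- Search the current error log for entries containing 18456.
-- Parameters: log number (0 = current), log type (1 = SQL Server error log),
-- and a search string.
EXEC sp_readerrorlog 0, 1, N'18456';
```

The State value reported in the matching log entries indicates the specific cause of the login failure.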
For more information about the State values returned with error number 18456, see the topic
MSSQLSERVER_18456 in the SQL Server Technical Documentation:
MSSQLSERVER_18456
http://aka.ms/n2n2j6
Objectives
At the end of the lab, you will be able to troubleshoot and resolve:
Service issues
Performance issues
Password: Pa55w.rd
Results: After this exercise, you will have investigated and resolved a SQL login issue; the PromoteApp
login will be functioning properly.
The report server cannot open a connection to the report server database. A
connection to the database is required for all requests and processing.
(rsReportServerDatabaseUnavailable)
For more information about this error navigate to the report server on the local
server machine, or enable remote errors
This instance of Reporting Services uses databases hosted by the MIA-SQL\SQL2 database engine
instance.
2. Use Event Viewer to check the Windows system log for messages relating to the MIA-SQL\SQL2
service.
Resolve the Issue
1. The network administrator has notified you that the password for the
ADVENTUREWORKS\ServiceAcct account was recently changed to Pa55w.rd. This information might
help you to resolve the problem.
Sqlcmd: Error: Microsoft ODBC Driver 11 for SQL Server : Login failed for user
'Adventureworks\AnthonyFrizzell'
2. Use the SQL Server error log to try to determine the cause of the user’s connectivity problem.
3. What should the user do to resolve the connectivity problem?
Results: At the end of this exercise, you will be able to explain to the user why he cannot connect to the
database.
Results: After this exercise, you will have investigated and resolved a job execution issue.
2. Determine the cause of the issue and resolve it. When the issue is resolved, two of the command
prompt windows that were started in the first task of this exercise will close.
If you are uncertain how to proceed, check Activity Monitor for the MIA-SQL database engine
instance.
3. Once the issue is resolved, close all the application windows opened on 20764C-MIA-SQL.
Results: At the end of this exercise, you will have resolved a performance issue.
Question: What tools might you use to monitor an intermittent or long-term issue?
Review Question(s)
Question: How do you rate your troubleshooting skills? What could you do to improve
them?
Tools
Participating in online forums, where developers and administrators post questions about SQL Server
issues, is a good way to practice your troubleshooting skills, and to gain insight into methods used by
other troubleshooters.
Module 15
Importing and Exporting Data
Contents:
Module Overview 15-1
Lesson 1: Transferring Data to and from SQL Server 15-2
Module Overview
While a great deal of the data residing in a Microsoft® SQL Server® system is entered directly by users who
are running application programs, there is often a need to move data between SQL Server and other
locations.
SQL Server provides a set of tools you can use to transfer data in and out. Some of these tools, such as the
bcp (Bulk Copy Program) utility and SQL Server Integration Services, are external to the database engine.
Other tools, such as the BULK INSERT statement and the OPENROWSET function, are implemented in the
database engine. With SQL Server, you can also create data-tier applications that package all the tables,
views, and instance objects associated with a user database into a single unit of deployment.
In this module, you will explore these tools and techniques so that you can import and export data to and
from SQL Server.
Objectives
After completing this module, you will be able to:
Lesson 1
Transferring Data to and from SQL Server
The first step in learning to transfer data in and out of SQL Server is to become familiar with the processes
involved, and with the tools that SQL Server provides to implement data transfer.
When large amounts of data have to be inserted into SQL Server tables, the default settings for
constraints, triggers, and indexes are not likely to provide the best performance possible. You might
achieve improved performance by controlling when the checks that are made by constraints are carried
out and when the index pages for a table are updated.
Another approach is to load data into a staging or work table, process it, then move the processed data
into its final location. Partition switching offers one method of achieving this.
Lesson Objectives
After completing this lesson, you will be able to:
Describe core data transfer concepts.
Describe the tools that SQL Server provides for data transfer.
Extracting the data from its source location.
Transforming the data in some way to make it suitable for the target system.
Loading the data into its destination in the target system.
Together, these three steps are commonly referred to as an Extract, Transform, Load (ETL) process, which
can be implemented by the use of ETL tools.
Extracting Data
While there are other options, extracting data typically involves executing queries on a source system to
retrieve the data, or opening and reading source files.
When extracting data, there are two main goals:
To avoid excessive impact on the source system. For example, do not read entire tables of data when
you only have to read selected rows or columns. Also, do not continually re-read the same data, and
avoid the execution of statements that block users of the source system in any way.
To ensure the consistency of the data extraction. For example, do not include one row from the
source system more than once in the output of the extraction.
Transforming Data
The transformation phase of an ETL process will generally involve several steps, such as the following:
Data might have to be cleansed. For example, you might want to remove erroneous data, eliminate
duplicates, or provide default values for missing columns.
Lookups might have to be performed. For example, the input data might include the name of a
customer, but the database might need an ID for the customer.
Data might have to be aggregated. For example, the input data might include every transaction that
occurred on a given day, but the database might require only daily summary values.
Data might have to be de-aggregated. This is often referred to as data allocation. For example, the
input data might include quarterly budgets, but the database might require daily budgets.
In addition to these common operations, data might have to be restructured in some way—for example,
by pivoting the data so that columns become rows, concatenating multiple source columns into a single
column, or splitting a single source column into multiple columns.
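As an illustration, an aggregation transform of the kind described above can be performed in Transact-SQL once the data has been loaded into a staging table; all table and column names in this sketch are hypothetical:

```sql
-- Aggregate individual transactions into one summary row per customer per day.
-- The staging.SalesTransaction and dbo.DailySalesSummary tables are hypothetical.
INSERT INTO dbo.DailySalesSummary (CustomerID, SaleDate, TotalAmount)
SELECT CustomerID,
       CAST(TransactionDate AS date) AS SaleDate,
       SUM(Amount) AS TotalAmount
FROM staging.SalesTransaction
GROUP BY CustomerID, CAST(TransactionDate AS date);
```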
Note: Data transformation is often most complex when you do not have control over the
data extraction phase—for example, when data files are provided to you in a fixed format by a
third party.
Loading Data
After data is transformed into an appropriate format, you can load it into the target system. Instead of
performing row-by-row insert operations for the data, you can use special options for loading data in
bulk. Additionally, you can make temporary configuration changes to improve the performance of the
load operation.
BULK INSERT
You can use the BULK INSERT Transact-SQL statement to import data directly from an operating system
data file into a database table. Although the configuration options for BULK INSERT and bcp are similar,
BULK INSERT differs from bcp in a number of ways:
You execute the BULK INSERT statement from within Transact-SQL, whereas the bcp utility is a
command-line utility.
While the bcp utility can be used for both import and export, the BULK INSERT statement can only
be used for data import.
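A minimal sketch of a BULK INSERT statement; the file path, table name, and file layout are assumptions:

```sql
-- Import a comma-delimited file into dbo.Customer, skipping a header row.
-- The file path and table are hypothetical.
BULK INSERT dbo.Customer
FROM 'D:\ImportData\customers.csv'
WITH (
    FIELDTERMINATOR = ',',
    ROWTERMINATOR = '\n',
    FIRSTROW = 2,        -- skip the header row
    BATCHSIZE = 50000    -- commit the load in batches
);
```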
OPENROWSET (BULK)
OPENROWSET is a table-valued function that you can use to connect to and retrieve data from OLE-DB
data sources. Full details of how to connect to the data source must be provided as parameters to the
OPENROWSET function. You can use OPENROWSET to connect to other database engines and data
providers for which a driver is installed on the Windows® computer hosting SQL Server.
A special OLE-DB provider called BULK is available for reading data from text files with the OPENROWSET
function.
With the BULK provider, you can import entire documents from the file system.
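For example, the SINGLE_BLOB option of OPENROWSET (BULK) reads an entire file as a single varbinary(max) value; the table name and file path in this sketch are hypothetical:

```sql
-- Read a whole file as one varbinary(max) value and insert it into a table.
-- The dbo.Document table and the file path are hypothetical.
INSERT INTO dbo.Document (DocName, DocContent)
SELECT N'Contract2017.pdf', src.BulkColumn
FROM OPENROWSET(BULK 'D:\ImportData\Contract2017.pdf', SINGLE_BLOB) AS src;
```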
For a further overview of tools for bulk data transfer into and out of SQL Server, see the topic
Bulk Import and Export of Data (SQL Server) in the SQL Server Technical Documentation:
For example, consider a FOREIGN KEY constraint that ensures that the relevant customer does exist
whenever a customer order is inserted into the database. While you could check this reference for each
customer order, it is possible that a customer may have thousands of orders, resulting in thousands of
checks. Instead of checking each value as it is inserted, you can check the customer reference as a single
lookup after the overall import process completes—to cover all customer orders referring to that
customer.
In the same way that avoiding lookups for FOREIGN KEY constraints during data import can improve
performance, avoiding the constant updating of indexes can have a similar effect. In many cases,
rebuilding the indexes after the import process is complete is much faster than updating the indexes as
the rows are imported. The exception is when the table already holds many more rows than are being
imported.
Triggers are blocks of code that execute automatically when data is modified. Triggers operate on batches of rows—
they do not fire once per inserted row, but instead once for each batch. It is important to decide if the
processing that the triggers perform would also be better processed in bulk after the import; if you
enforce business rules or initiate processes using triggers, you might not be able to disable them during
bulk import without a lot of additional work when the import is complete.
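Where triggers can safely be disabled during a load, the pattern can be sketched as follows, assuming the dbo.Customer table used in the earlier index examples:

```sql
-- Disable all triggers on the table before the bulk load...
DISABLE TRIGGER ALL ON dbo.Customer;

-- ...perform the bulk load here...

-- ...then re-enable the triggers, and run any deferred
-- trigger processing as a single bulk operation.
ENABLE TRIGGER ALL ON dbo.Customer;
```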
Bulk operations that insert data into columnstore tables do not benefit from table level locking, because
each worker in a parallelized or concurrent bulk insert operation adds data to its own columnstore
rowgroup.
For more information on bulk loading into columnstore tables, see the topic Columnstore Indexes Data
Loading in the SQL Server Technical Documentation:
Not all commands can use minimal logging. While not an exhaustive list, the items below indicate the
types of requirements that must be met for minimal logging to apply:
The table is not an article in a replication publication.
If the table has no clustered index but has one or more nonclustered indexes, data pages are always
minimally logged. How index pages are logged, however, depends on whether the table is empty.
If the table has a clustered index and is empty, both data and index pages are minimally logged.
If a table has a clustered index and is nonempty, data pages and index pages are both fully logged,
regardless of the recovery model.
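Minimal logging also requires the database to use the SIMPLE or BULK_LOGGED recovery model. A common pattern, sketched here with a hypothetical database name, is to switch to BULK_LOGGED for the duration of the load:

```sql
-- Switch to BULK_LOGGED for the load, then back to FULL.
-- The database name SalesDB is hypothetical.
ALTER DATABASE SalesDB SET RECOVERY BULK_LOGGED;

-- ...perform the bulk load here...

ALTER DATABASE SalesDB SET RECOVERY FULL;
-- Take a log backup afterwards to restore point-in-time recovery capability.
```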
For more information about SQL Server recovery models, see the topic Recovery Models (SQL Server) in
the SQL Server Technical Documentation:
For more information about the prerequisites for minimally logged operations, see the topic Prerequisites
for Minimal Logging in Bulk Import in the SQL Server Technical Documentation:
Disable an Index
ALTER INDEX idx_emailaddress ON dbo.Customer DISABLE;
You can disable all of the indexes on a table by using the ALL keyword, as shown in the following code example:
Disable All Indexes on a Table
ALTER INDEX ALL ON dbo.Customer DISABLE;
Note: A clustered index defines how data in a table is physically ordered. If the clustered
index is disabled, the table becomes unusable until the index is rebuilt.
The major advantage of disabling an index—instead of dropping it—is that you can put the index back
into operation by using a rebuild operation. When you rebuild an index, you do not have to know details
of the index definition. This makes it easier to create administrative scripts that stop indexes being
updated while large import or update operations are taking place—and that put the indexes back into
operation after those operations have completed.
For more information about disabling indexes, see the topic Disable Indexes and Constraints in the SQL
Server Technical Documentation:
Rebuilding Indexes
After data has been imported, you can enable indexes again. To enable a disabled index, you must rebuild
it. You can rebuild the indexes on a table by using the graphical tools in SSMS, by using the ALTER
INDEX Transact-SQL statement, or by using the deprecated DBCC DBREINDEX command.
The following code example shows how to rebuild the idx_emailaddress on the dbo.Customer table:
Rebuild an Index
ALTER INDEX idx_emailaddress ON dbo.Customer REBUILD;
You can also use the ALL keyword with the ALTER INDEX statement to rebuild all indexes on a specified
table—similar to disabling an index.
Note: In addition to enabling a disabled index, you might rebuild an index as part of
regular index maintenance.
An index rebuild can be minimally logged if the database is using the BULK_LOGGED or SIMPLE
recovery model.
For more information on rebuilding indexes, see the topic ALTER INDEX (Transact-SQL) in the SQL Server
Technical Documentation:
Recreating Indexes
In some circumstances, the cost of rebuilding an index after it has been disabled may be greater than
dropping and recreating the index. To recreate an index, you must know the index definition.
To recreate an index, replacing the existing one, you can use the CREATE INDEX statement with the
DROP_EXISTING option as shown in the following example:
Recreate an Index
CREATE INDEX idx_emailaddress ON dbo.Customer(EmailAddress)
WITH (DROP_EXISTING = ON);
To disable a primary key or unique constraint, you must first disable the index that is associated with the
constraint. When you re-enable the constraint, the associated indexes are automatically rebuilt. If
duplicate values are found during the rebuild, the re-enabling of the constraint will fail. For this reason, if
you disable these constraints while importing data, you must be sure that the data being imported will
not violate the rules that the constraints enforce.
Note: Disabling primary key and unique constraints is only possible for nonclustered
constraints. If a table has a primary key or unique constraint enforced with a clustered index,
disabling the index associated with the constraint prevents access to any data in the table.
When you disable a primary key constraint, any foreign key constraints that reference the disabled
primary key are also disabled.
Best Practice: In general, you should not disable primary key or unique constraints during
bulk load operations without very good reason. Both constraint types are critical to the integrity
of your data; it is likely to be easier to prevent invalid data from being loaded—by leaving
primary key and unique constraints in place—than it would be to correct it after a bulk load.
You can use check constraints to limit the values that can be contained in a column or the relationship
between the values in multiple columns in a table.
You can disable both foreign key and check constraints by using the NOCHECK option of the ALTER
TABLE statement, as in the following example, which disables all such constraints on the table:
Disable Foreign Key and Check Constraints
ALTER TABLE dbo.Customer NOCHECK CONSTRAINT ALL;
You can re-enable foreign key and check constraints by using the CHECK option of the ALTER TABLE
statement. When you specify the CHECK option alone:
The check constraint is only applied to new data added to the table after the constraint is re-enabled.
The constraint is marked as untrusted in the system metadata, and is ignored for query plan
generation.
To force all existing data to be checked against the constraint, use the WITH CHECK CHECK option, as in the following example:
Re-enable Constraints and Check Existing Data
ALTER TABLE dbo.Customer WITH CHECK CHECK CONSTRAINT ALL;
Using the WITH CHECK CHECK option marks the constraint as trusted. For a large table, checking
all existing data against a constraint can be an expensive and time-consuming operation, which might
generate many thousands of logical reads.
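You can inspect the disabled and untrusted status of constraints through the catalog views:

```sql
-- Check constraints: disabled/untrusted status.
SELECT name, is_disabled, is_not_trusted
FROM sys.check_constraints;

-- Foreign key constraints: disabled/untrusted status.
SELECT name, is_disabled, is_not_trusted
FROM sys.foreign_keys;
```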
Note: The bcp and BULK INSERT transfer tools both ignore check and foreign key
constraints by default when importing data, which causes the constraints to be marked as
untrusted. Use the CHECK_CONSTRAINTS option with these tools to force check and foreign key
constraints to be tested for bulk-inserted data.
Demonstration Steps
1. Ensure that the MT17B-W2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines are
running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the password
Pa55w.rd.
2. Start SQL Server Management Studio and connect to your Azure instance running the
AdventureWorksLT database, using SQL Server authentication.
6. Execute the code under the heading for Step 2 to create two tables for this demonstration.
7. Execute the code under the heading for Step 3 to show the current state of the check constraint.
8. Execute the code under the heading for Step 4 to disable the check constraint.
9. Execute the code under the heading for Step 5 to show that the check constraint is marked as
disabled and untrusted.
10. Execute the code under the heading for Step 6 to enable the check constraint with CHECK.
11. Execute the code under the heading for Step 7 to show that the constraint is enabled and marked
untrusted.
12. Execute the code under the heading for Step 8 to enable the check constraint WITH CHECK CHECK.
13. Execute the code under the heading for Step 9 to show that the constraint is enabled and trusted.
14. Execute the code under the heading for Step 10 to disable a nonclustered primary key.
15. Execute the code under the heading for Step 11 to show the state of the indexes on the table.
16. Execute the code under the heading for Step 12 to demonstrate that data can still be inserted into
the table.
17. Execute the code under the heading for Step 13 to enable the index.
18. Execute the code under the heading for Step 14 to show the state of the indexes on the table.
19. Execute the code under the heading for Step 15 to disable a clustered primary key constraint. Note
the warning messages generated by this command.
20. Execute the code under the heading for Step 16 to show that all the indexes on the table are
disabled.
21. Execute the code under the heading for Step 17 to enable the clustered index.
22. Execute the code under the heading for Step 18 to show that the nonclustered index remains
disabled.
23. Execute the code under the heading for Step 19 to enable the nonclustered index.
24. Execute the code under the heading for Step 20 to enable the foreign key constraint that references
the clustered primary key.
25. Execute the code under the heading for Step 21 to drop the demonstration objects.
Partition Switching
Partition switching is a feature designed to assist
with the management of data in partitioned
tables—typically for bulk loading or for
archiving. You can use partition switching to
swap a partition in one table with a partition in
another table. Because the switch operation
requires only an update of metadata, it
completes almost instantly, regardless of the
amount of data in the partitions being switched.
To switch partitions, use the ALTER TABLE…SWITCH statement. The following statement switches the
single partition of the unpartitioned dbo.FX_rate_staging table into partition four of the partitioned
dbo.FX_rate table:
Partition Switching
ALTER TABLE dbo.FX_rate_staging SWITCH TO dbo.FX_rate PARTITION 4;
You might use partition switching as a method of loading data into a partitioned table while maximizing
the time the table is accessible to users.
Because an unpartitioned table is treated as a table with a single partition, partition switching is also
supported between two unpartitioned tables. This means you can use partition switching on editions of
SQL Server that do not support partitioning.
A partition switch with unpartitioned tables also uses ALTER TABLE…SWITCH. The following statement
switches the contents of the unpartitioned dbo.FX_rate_2_staging table into the empty unpartitioned
dbo.FX_rate_2 table:
Switch Unpartitioned Tables
ALTER TABLE dbo.FX_rate_2_staging SWITCH TO dbo.FX_rate_2;
For more information on partition switching, see the topic ALTER TABLE (Transact-SQL) in the SQL Server
Technical Documentation:
Demonstration Steps
1. In Solution Explorer, open the query file Demo 02 - partition switch.sql.
3. Execute the code under the heading for Step 2 to create a partition function, partition scheme and
partitioned table.
4. Execute the code under the heading for Step 3 to create and add data to the unpartitioned table that
matches the schema of the partitioned table.
5. Execute the code under the heading for Step 4 to switch partition one of the partitioned table with
the unpartitioned table.
6. Execute the code under the heading for Step 5 to demonstrate the effect of the switch.
7. Execute the code under the heading for Step 6 to create three identical unpartitioned tables, and add
data to SalesLT.ShippingRate.
8. Execute the code under the heading for Step 7 to add data to SalesLT.ShippingRateStaging,
representing a data load.
9. Execute the code under the heading for Step 8 to switch partitions. Note the use of the third table so
that one of the participants in a switch is always empty.
10. Execute the code under the heading for Step 9 to demonstrate the effect of the switch.
11. Execute the code under the heading for Step 10 to drop the demonstration objects.
12. Leave SSMS open for the next demonstration.
Lesson 2
Importing and Exporting Table Data
This lesson begins an exploration of the tools and techniques available in SQL Server for importing and
exporting data. In this lesson, you will learn about linked servers, SQL Server Integration Services, and the
SQL Server Data Import and Export Wizard.
Lesson Objectives
After completing this lesson, you will be able to:
Linked Servers
Linked servers provide a method for you to
execute commands against remote OLE DB data
sources that you access regularly from within a
SQL Server instance. You can create a linked
server for any data source for which a suitable
OLE DB driver is installed on the operating
system hosting your SQL Server instance,
including other instances of SQL Server.
The definition for a linked server data source will typically include a security context under which the
connection to the remote OLE DB data source should be made. You may configure this security context in
different ways, including using a single username for all connections, or mapping specific SQL Server
logins to logins on the remote data source.
For more information on linked servers, see the topic Linked Servers (Database Engine) in the SQL Server
Technical Documentation:
The details of linked server configuration vary by OLE DB provider; different OLE DB providers require—
and support—different configuration options.
For more information on creating linked servers, see the topic Create Linked Servers (SQL Server Database
Engine) in the SQL Server Technical Documentation:
For more information on creating linked servers from Transact-SQL commands, including a table of
configuration values for common OLE DB providers, see the topic sp_addlinkedserver (Transact-SQL) in the
SQL Server Technical Documentation:
sp_addlinkedserver (Transact-SQL)
https://aka.ms/Ow9493
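As a sketch, the following creates a linked server to a remote SQL Server instance and maps all local logins to a single remote login; the server, instance, login, and password values are all hypothetical:

```sql
-- Create a linked server named Manufacturing pointing at a remote
-- SQL Server instance (all names are hypothetical).
EXEC sp_addlinkedserver
    @server = N'Manufacturing',
    @srvproduct = N'',
    @provider = N'SQLNCLI11',
    @datasrc = N'MFG-SQL01';

-- Map all local logins (@locallogin = NULL) to one remote SQL Server login.
EXEC sp_addlinkedsrvlogin
    @rmtsrvname = N'Manufacturing',
    @useself = N'FALSE',
    @locallogin = NULL,
    @rmtuser = N'LinkedReader',
    @rmtpassword = N'Pa55w.rd';
```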
You can issue queries against a linked server in two ways:
Four-part naming.
OPENQUERY.
Four-Part Naming
When you use four-part naming, you refer to a table or view on a linked server in the body of a Transact-
SQL statement, almost as if it were part of the local database—the difference being that you qualify the
object name with four parts (linked server name, database, schema, and object name):
Four-Part Naming
SELECT p.PartId, m.manufacture_date
FROM Production.Part AS p
JOIN Manufacturing.mf1.prodution.job AS m
ON m.part_identifier = p.PartId;
Note: When referenced using four-part names, the performance of linked server queries
can be unpredictable. In some circumstances, SQL Server might attempt to apply a filter or a join
to a linked server table by retrieving the entire contents of the remote table and filtering it
locally. If the remote table is very large, this can be a time-consuming operation.
OPENQUERY
The OPENQUERY command is used to execute a command against a linked server as a pass-through
operation, as if SQL Server were a client application of the remote data source. The command might be
any valid command in the query language supported by the remote data source; if the remote data
source uses a different dialect of SQL than Transact-SQL, commands issued with OPENQUERY must be in
the remote data source dialect.
The following example uses OPENQUERY to return data from the Manufacturing linked server:
OPENQUERY SELECT
SELECT * FROM OPENQUERY(Manufacturing, 'SELECT TOP 10 manufacture_date FROM
mf1.prodution.job WHERE part_identifier = 1624 ORDER BY manufacture_date DESC');
You may also issue DML commands against the result set returned by OPENQUERY, if the OLE DB
provider supports it.
The following example uses OPENQUERY to update a row in a table on the Manufacturing linked server:
OPENQUERY UPDATE
UPDATE OPENQUERY (Manufacturing, 'SELECT cancel_job FROM mf1.prodution.job WHERE id =
1208')
SET cancel_job = 'Y';
For more information on using OPENQUERY, see the topic OPENQUERY (Transact-SQL) in the SQL Server
Technical Documentation:
OPENQUERY (Transact-SQL)
https://aka.ms/Xw7tts
SSIS Designer. A graphical design interface for developing SSIS solutions in the Microsoft Visual
Studio® development environment. Typically, you start the SQL Server Data Tools (SSDT) application
to access this.
Wizards. Graphical utilities you can use to quickly create, configure, and deploy SSIS solutions.
Command-line tools. Utilities you can use to manage and execute SSIS packages.
An SSIS solution usually consists of one or more SSIS projects, each containing at least one SSIS package.
SSIS Projects
In SQL Server, a project is the unit of deployment for SSIS solutions. You can define project-level
parameters so that users can specify run-time settings, and project-level connection managers that
reference data sources and destinations used in package data flows. You can then deploy projects to an
SSIS catalog in a SQL Server instance, and configure project-level parameter values and connections as
appropriate for execution environments. You can use SSDT to create, debug, and deploy SSIS projects.
SSIS Packages
A project contains one or more packages, each defining a workflow of tasks to be executed. The workflow
of tasks is referred to as its control flow. A package control flow can include one or more data flow tasks,
each of which encapsulates its own pipeline. You can include package-level parameters so that the
package receives dynamic values at run time. In previous SSIS releases, deployment was managed at the
package level. In SQL Server, you can still deploy individual packages in a package deployment model.
SSIS provides two utilities that you can use to run packages:
DTExec utility. You can use DTExec to run SSIS packages from the command line. You have to
specify parameters including the server to use, the location of the package, environment variables,
and input parameters. The utility reads the command-line parameters, loads the package, configures
the package options based on the parameters passed, and then runs the package. It returns an exit
code signifying the success or failure of the package.
DtExecUI utility. The Execute Package Utility (DtExecUI) is a GUI for the DTExec command
prompt utility; you can run it from SQL Server Management Studio (SSMS) or from the command
prompt. The GUI simplifies the process of passing parameters to the utility and receiving exit
codes.
For more information on SSIS, see the topic SQL Server Integration Services in the SQL Server Technical
Documentation:
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
4. Start Visual Studio, and open the SSISProject.sln solution in the D:\Demofiles\Mod15\SSISProject
folder.
5. Give a brief demonstration of the different areas of an SSIS project in Visual Studio.
8. Click and drag the Data Flow Task from the SSIS Toolbox pane to the Package.dtsx [Design] pane.
Release the Data Flow Task anywhere within the Control Flow tab of the Package.dtsx [Design]
pane.
9. Right-click Data Flow Task, click Rename, type Top Level Domain Name Import, and then press
Enter.
10. Double-click Top Level Domain Name Import to go to the Data Flow tab of the designer.
11. Click and drag Source Assistant from the SSIS Toolbox pane to the Data Flow tab of the
Package.dtsx [Design] pane.
12. In the Source Assistant - Add New Source dialog box, in the Select source type box, click Flat File.
14. In the Flat File Connection Manager Editor dialog box, on the General page, in the Connection
Manager Name box, type TLD File.
17. Clear the Column names in the first data row check box.
18. On the Columns page, examine the preview to ensure that two columns are shown.
19. On the Advanced page, for Column 0, change the OutputColumnWidth to 100, and then click OK.
20. Right-click Flat File Source, click Rename, type TLD File Source, and then press Enter.
21. Click and drag Destination Assistant from the SSIS Toolbox pane to the Data Flow tab.
22. In the Destination Assistant - Add New Destination dialog box, confirm that Select destination
type is SQL Server.
23. In the Select Connection managers box, click New, and then click OK.
24. In the Connection Manager dialog box, in the Server name box, type MIA-SQL.
25. In the Select or enter a database name list, click salesapp1, and then click Test Connection.
26. In the Connection Manager dialog box, note the test was successful, and then click OK.
28. Right-click OLE DB Destination, click Rename, type salesapp1 DB, and then press Enter.
29. Click TLD File Source, then click the left (blue) arrow on the bottom of the TLD File Source object,
and then click the salesapp1 DB object.
31. In the OLE DB Destination Editor dialog box, on the Connection Manager page, in the Name of
the table or the view list, click [dbo].[TopLevelDomain]. Point out the Table Lock and Check
constraints check boxes and relate them back to the previous lesson.
32. On the Mappings page, in the first Input Column box, click Column 0.
33. In the second Input Column box, click Column 1, and then click OK.
35. When the package has completed, on the Debug menu, click Stop Debugging.
36. In SQL Server Management Studio, open the query file Demo 03 - SSIS.sql.
37. On the Query menu, point to Connection, and then click Change Connection.
38. In the Connect to Database Engine dialog box, in Server name box, type MIA-SQL, and in the
Authentication list, click Windows Authentication, and then click Connect.
39. Execute the query in the file to view the uploaded contents of the dbo.TopLevelDomain table.
40. Close Visual Studio without saving changes. Leave SSMS open for the next demonstration.
Note: On a 64-bit computer, SSIS setup installs the 64-bit version of the Import and Export
Wizard. However, some data sources might only have 32-bit providers. To use these data sources,
you must install the 32-bit version of the SQL Server Import and Export Wizard. Selecting the
Management Tools - Complete option during installation installs both the 32-bit and 64-bit
versions of the wizard.
You can use the wizard to perform the data transfer immediately, or you can save the SSIS package it
generates for execution at a later time. You can edit an SSIS package generated by the Import and Export
Wizard using SSDT.
The SQL Server Import and Export Wizard can be accessed from the Windows Start menu, from SSDT, or
through SSMS. You can only start the 32-bit version of the wizard from SSMS. If they are installed, the
Start menu will contain links to both the 32-bit and 64-bit versions of the wizard.
For more information on the SQL Server Import and Export Wizard, see the topic SQL Server Import and
Export Wizard in the SQL Server Technical Documentation:
Demonstration Steps
1. Ensure that the MT17B-WS2016-NAT, 20764C-MIA-DC, and 20764C-MIA-SQL virtual machines
are running, and then log on to 20764C-MIA-SQL as ADVENTUREWORKS\Student with the
password Pa55w.rd.
2. In SQL Server Management Studio, in Object Explorer, click Connect, and then click Database
Engine.
3. In the Connect to Database Engine dialog box, in the Server name box, type MIA-SQL, and then
click Connect.
4. In Object Explorer, under MIA-SQL, expand Databases, right-click salesapp1, point to Tasks, and
then click Export Data.
5. In the SQL Server Import and Export Wizard window, click Next.
6. On the Choose a Data Source page, in the Data source box, click SQL Server Native Client 11.0.
7. Verify that the Database box has the value salesapp1, and then click Next.
8. On the Choose a Destination page, in the Destination box, click Flat File Destination.
9. In the File name box, type D:\Demofiles\Mod15\export.txt, and then click Next.
10. On the Specify Table Copy or Query page, verify that Copy data from one or more tables or
views is selected, and then click Next.
11. On the Configure Flat File Destination page, in the Source table or view list, click
[Production].[Categories], and then click Next.
12. On the Save and Run Package page, verify that Run immediately is selected, and then click Finish.
13. On the Complete the Wizard page, click Finish to run the export.
15. Using File Explorer, open D:\Demofiles\Mod15\export.txt to verify the result of the export.
Lesson 3
Using bcp and BULK INSERT to Import Data
This lesson continues an exploration of the tools and techniques available in SQL Server for importing and
exporting data. In this lesson, you will learn about bcp, BULK INSERT, and OPENROWSET.
Lesson Objectives
After completing this lesson, you will be able to:
bcp Syntax
The syntax for the bcp utility is versatile, and
includes a large number of options. The general
form of a bcp command specifies:
A table or view in a SQL Server database,
which will be the source for data export or the target for data import.
A direction (in when importing data into SQL Server; out when exporting data from SQL Server).
A local file name for the source (when importing) or destination (when exporting).
You can also use the queryout direction to specify that data is to be extracted from the database based
on a Transact-SQL query. Additionally, the bcp utility supports the following commonly-used arguments.
Note that the arguments are case-sensitive:
-d. The database containing the table or view (you can also specify a fully-qualified table or view
name that includes the database and schema—for example, AdventureWorks.Sales.Currency).
-T. Specifies that a trusted connection should be used to connect using Windows authentication.
-c. Specifies that the data file stores data in character format.
-w. Specifies that the data file stores data in wide (Unicode) character format.
-n. Specifies that the data file stores data in SQL Server native format.
-f format_file. Specifies a format file that defines the schema for the data.
-t delimiter. Specifies a field terminator for data in character format. The default is a tab.
-r delimiter. Specifies a row terminator for data in character format. The default is a new line.
Note: By default, bcp will use column default values when inserting to columns for which
no value is specified. Use the -k argument to override this behavior and insert NULL instead of
using defaults.
For more information on handling NULL in bcp operations, see the topic Keep Nulls or Use Default Values
During Bulk Import (SQL Server) in the SQL Server Technical Documentation:
Keep Nulls or Use Default Values During Bulk Import (SQL Server)
https://aka.ms/O2popc
If you are using bcp to transfer data between instances of SQL Server, you should use the -n argument to
specify native formatting. In native format, bcp files occupy less space than equivalent character or wide
character files; using the native format means that you do not have to specify field and row delimiters.
The following example connects to the MIA-SQL SQL Server instance using Windows authentication, and
exports the contents of the Sales.Currency table in the AdventureWorks database to a text file named
Currency.csv, in which the data is saved in comma-delimited character format with a new line for each
row:
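The original code listing for this example is not reproduced here; as a sketch, the command described could look like the following (the instance name MIA-SQL and the object names come from the surrounding text):

```
bcp AdventureWorks.Sales.Currency out Currency.csv -S MIA-SQL -T -c -t, -r\n
```

The -T argument connects with Windows authentication, -c selects character format, -t, sets a comma as the field delimiter, and -r\n terminates each row with a new line (which is also the default).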
To pre-emptively create a format file, use the format nul direction and specify the name of the format file
you want to create. You can then interactively specify the data type, prefix length, and delimiter for each
field in the specified table or view, and save the resulting schema in the format file. The default format file
type is text, but you can use the -x argument to create an XML format file. If you want to create a format
file for character data with specific field and row terminators, you can specify them with the -c, -t, and -r
arguments.
The following example shows how to use bcp to create an XML-based format file named
CurrencyFmt.xml based on the AdventureWorks.Sales.Currency table:
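As a sketch (the server name MIA-SQL and the comma delimiter are assumptions carried over from the earlier examples), the command could be:

```
bcp AdventureWorks.Sales.Currency format nul -S MIA-SQL -T -c -t, -x -f CurrencyFmt.xml
```

The format nul direction tells bcp to generate a format file without transferring any data; -x requests the XML format, and -f names the file to create.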
To use a format file when importing or exporting data, use the -f argument.
The following example shows how to import the contents of Currency.csv into the
Finance.dbo.Currency table. The in parameter specifies the file to read and the -f argument specifies the
format file to use:
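A sketch of such an import command, assuming the data and format files are in the current directory and reusing the MIA-SQL instance from the earlier examples:

```
bcp Finance.dbo.Currency in Currency.csv -S MIA-SQL -T -f CurrencyFmt.xml
```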
Note: When you use a format file to import data, empty string values in the source file will
be converted to NULL.
For more information on bcp, including a full description of accepted arguments, see the topic bcp Utility
in the SQL Server Technical Documentation:
bcp Utility
https://aka.ms/Ewe8lb
Demonstration Steps
1. Open a command prompt, type the following command, and then press Enter to view the bcp syntax
help:
bcp -?
2. At the command prompt, type the following command, and then press Enter to create a text format
file:
3. At the command prompt, type the following command, and then press Enter to create an XML format
file:
5. At the command prompt, type the following command, and then press Enter to export data using the
XML format file:
7. Using Notepad, open the D:\Demofiles\Mod15\bcp\Employees.csv file and view the data that has
been exported. Note that the commas in several of the data fields make this data unsuitable for
export using a comma as a field delimiter. Close Notepad when you have finished reviewing.
A key consideration for using the BULK INSERT statement is that file paths to source data must be
accessible from the server where the SQL Server instance is running, and must use the correct drive letters
for volumes as they are defined on the server. For example, when running a BULK INSERT statement from
a client computer, the path C:\data\file.txt references a file on the C: volume of the server, not the client.
Unlike bcp, you can execute the BULK INSERT statement from within a transaction you control, which
gives you the ability to group BULK INSERT with other operations in a single transaction. However, you
should take care to ensure that the size of the data batches you import within a single transaction is not
excessive, or significant log file growth might occur—even when the database uses the simple recovery model.
In the following example, new orders are inserted into the Sales.OrderDetail table from a text file on the
file system:
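A minimal sketch of such a statement (only the target table comes from the text; the file path and WITH options are illustrative assumptions):

```sql
BULK INSERT Sales.OrderDetail
FROM 'D:\ImportFiles\neworders.csv'
WITH (
    FIELDTERMINATOR = ',',  -- fields separated by commas
    ROWTERMINATOR = '\n'    -- one row per line
);
```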
Like bcp, BULK INSERT supports the use of format files to define the data types of the source data file.
Note: The BULK INSERT statement is not supported on Azure SQL Database. Use SSIS or
bcp to bulk-load data into Azure SQL Database.
For more information on the BULK INSERT statement, see the topic BULK INSERT (Transact-SQL) in the
SQL Server Technical Documentation:
Demonstration Steps
1. In SQL Server Management Studio, open the query file Demo 06 - BULK INSERT.sql.
2. Execute the code under the heading for Step 1 to demonstrate that the Finance.dbo.Currency table
is empty.
3. Execute the code under the heading for Step 2 to run a BULK INSERT statement to load
Finance.dbo.Currency with data.
4. Execute the code under the heading for Step 3 to verify that the table has been loaded with data.
5. Leave SSMS open for the next demonstration.
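The OPENROWSET function with the BULK provider reads a data file as a rowset that can be queried like a table. A minimal sketch, with illustrative file paths, is:

```sql
SELECT *
FROM OPENROWSET(
    BULK 'D:\ImportFiles\currency.csv',
    FORMATFILE = 'D:\ImportFiles\CurrencyFmt.xml'
) AS rows;  -- the rowset must be given an alias, here "rows"
```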
Note: The result set returned by OPENROWSET must be given an alias in the FROM clause,
using the AS keyword. In the previous example, the alias is rows.
As with the BULK INSERT statement, file paths used with the OPENROWSET function refer to volumes that
are defined on the server.
Two key advantages of OPENROWSET when compared to bcp or BULK INSERT are that:
OPENROWSET can be used in a query with a WHERE clause (to filter the rows that are loaded).
OPENROWSET can be used in a SELECT statement that is not necessarily associated with an INSERT
statement.
SINGLE_CLOB. This option reads an entire single-byte character-based file as a single value of data
type varchar(max).
SINGLE_NCLOB. This option reads an entire double-byte character-based file as a single value of data
type nvarchar(max).
SINGLE_BLOB. This option reads an entire binary file as a single value of data type varbinary(max).
In the following example, the data in the SignedAccounts.pdf file is inserted into the Document column
of the dbo.AccountsDocuments table:
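A sketch of the statement described (the file path is an assumption; BulkColumn is the name of the single column that OPENROWSET returns for SINGLE_BLOB reads):

```sql
INSERT INTO dbo.AccountsDocuments (Document)
SELECT blob.BulkColumn
FROM OPENROWSET(
    BULK 'D:\Documents\SignedAccounts.pdf',
    SINGLE_BLOB  -- read the whole file as one varbinary(max) value
) AS blob;
```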
Note: To use OPENROWSET with OLE DB providers other than BULK, the ad hoc
distributed queries system configuration option must be enabled.
The OLE DB provider must also be configured to provide ad hoc access—this is normally enabled
by default, but may be disabled by an administrator. This provider setting can be configured
through the properties of the provider under the Providers node, under Server Objects, in the SSMS
Object Explorer. The Disallow adhoc access setting must not be selected to allow the provider
to be used in ad hoc queries. Alternatively, you can use the sp_MSset_oledb_prop system stored
procedure to set the value of the DisallowAdHocAccess provider setting.
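As a sketch, clearing DisallowAdHocAccess for a provider from Transact-SQL looks like the following (the provider name shown is an illustrative assumption; substitute the provider you are configuring):

```sql
-- 0 clears DisallowAdHocAccess, permitting the provider in ad hoc queries
EXEC master.dbo.sp_MSset_oledb_prop
    N'Microsoft.ACE.OLEDB.12.0',  -- example provider name (assumption)
    N'DisallowAdHocAccess',
    0;
```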
Note: OPENROWSET is not supported on Azure SQL Database. Use SSIS or bcp to bulk-
load data into Azure SQL Database.
For more information on using OPENROWSET, see the topic OPENROWSET (Transact-SQL) in the SQL
Server Technical Documentation:
OPENROWSET (Transact-SQL)
https://aka.ms/Xmtpzp
Demonstration Steps
1. In SQL Server Management Studio, open the query file Demo 07 - OPENROWSET.sql.
2. Execute the code under the heading for Step 1 to demonstrate that the Finance.dbo.SalesTaxRate
table is empty.
3. Execute the code under the heading for Step 2 to demonstrate a SELECT statement using the
OPENROWSET BULK provider.
4. Execute the code under the heading for Step 3 to demonstrate that the output of an OPENROWSET
statement can be filtered with a WHERE clause.
5. Execute the code under the heading for Step 4 to use an OPENROWSET statement to insert data into
the Finance.dbo.SalesTaxRate table.
6. Execute the code under the heading for Step 5 to demonstrate that the Finance.dbo.SalesTaxRate
table now contains data.
Lesson 4
Deploying and Upgrading Data-Tier Applications
Data-tier applications (DACs) provide a way to simplify the development, deployment, and management
of database applications and their SQL Server instance-level dependencies. They provide a logical
container for application databases for use in installation and upgrade.
Lesson Objectives
After completing this lesson, you will be able to:
Deploy a DAC.
Perform an in-place upgrade of a DAC.
Creating a DAC
There are two ways that a DAC might typically be created:
By application developers. Developers can define a DAC as part of the application development
process, using Visual Studio SSDT to create database object definitions required for their application.
When development is complete, the DAC is passed to administrators for deployment to production
SQL Server instances. As application requirements change and new features are added, the DAC is
updated and new DACPACs are generated so that updates can be deployed to production database
instances.
By database administrators. Database administrators (DBAs) can extract a DAC from an existing SQL
Server database. The resulting DACPAC might be used to deploy new copies of an application
database, or to deploy an application database to Azure SQL Database.
DACPACs may optionally include policies that indicate restrictions on where they can be deployed—for
instance, restricting deployment to a specific version and edition of SQL Server.
Note: Data-tier applications do not support all SQL Server objects. For example, XML
schema collections and SQL CLR based objects are not supported. For this reason, not all
databases are available for extraction to a DACPAC file. When SQL Server is unable to perform a
registration or extraction, the wizard displays the objects that are not supported.
For more information about the object types supported by DACs, see the topic DAC Support For SQL
Server Objects and Versions in the SQL Server Technical Documentation:
DAC Registration
When a database is deployed or upgraded from a DACPAC, the database is registered as a DAC with SQL
Server. When registration occurs, the DAC version and other associated metadata is recorded by the
database engine instance. DACs can be explicitly registered and unregistered using the REGISTER and
UNREGISTER actions of the DAC tools.
One of the benefits of using a DAC is that the DAC is versioned and can be upgraded across the
enterprise in a consistent, managed way. This is useful in large organizations where the same database
application might be deployed in multiple sites or virtual servers. Application administrators can easily
track which version of an application is installed in each location, and upgrade it to the latest version as
required.
BACPAC
While a DACPAC contains the definition for a database schema, it is also possible to export both the
schema and data of a DAC into a file format called BACPAC. BACPAC extends the DACPAC format to
include scripts to recreate a database, including its data.
For more information about DACs, see the topic Data-tier Applications in the SQL Server Technical
Documentation:
Data-tier Applications
https://aka.ms/Hpvzso
For more information on deploying DACs, see the topic Deploy a Data-tier Application in the SQL Server
Technical Documentation:
Deploy a Data-tier Application
https://aka.ms/T03qx1
For further information on the SqlPackage utility, including a complete list of arguments, see the topic
SqlPackage.exe on MSDN:
SqlPackage.exe
http://aka.ms/gaq8sj
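For example, deploying the Finance DACPAC used in this module's demonstration with SqlPackage could be sketched as follows (the Publish action creates or upgrades the target database; the file path and names come from the demonstration steps):

```
SqlPackage.exe /Action:Publish ^
    /SourceFile:"D:\Demofiles\Mod15\dacpac\Finance.dacpac" ^
    /TargetServerName:MIA-SQL ^
    /TargetDatabaseName:FinanceDAC
```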
A DAC may be upgraded using a DACPAC, in conjunction with any of these three tools:
SSMS Upgrade Data-Tier Application Wizard. With the upgrade wizard, you can apply a DACPAC
to an existing DAC. The wizard can facilitate some limited customization of the database to be
created from the DAC—for example, the database name, and location of data files.
SqlPackage. The SqlPackage command-line utility can apply a DACPAC to an existing DAC, and is
well suited to scripted or unattended upgrades.
PowerShell. For more complex automation than can be achieved using SqlPackage, DACs can be
upgraded using PowerShell.
The behavior of an in-place DAC upgrade may be configured using the following settings:
Ignore Data Loss. The upgrade will proceed even if data will be lost—for example, if a table is
dropped. By default, this setting is True.
Block on Changes. The upgrade will stop if the database schema would be changed by the upgrade.
By default, this setting is False.
Rollback on Failure. The upgrade is encapsulated in a transaction, which will be rolled back in the
event of an error. By default, this setting is False. When this setting is False, if an error occurs during
an in-place upgrade, the only way to restore the database to a known state will be to restore a
backup.
Skip Policy Validation. The DACPAC validation settings are bypassed. By default, this setting is False.
Best Practice: You should take a full database backup before proceeding with an in-place
upgrade of a DAC.
Each of the three deployment mechanisms—the upgrade wizard, SqlPackage, and PowerShell—can be
run in a mode whereby you can review the changes needed to complete the upgrade before they are
applied.
For more information on upgrading a DAC in-place, see the topic Upgrade a Data-tier Application in the
SQL Server Technical Documentation:
Note: Remember that you extract a database when you create a DACPAC for its schema.
You export a database when you create a BACPAC from its schema and data.
Although a DACPAC does not contain data for the complete database, you may optionally
include up to 10 MB of reference data in a DACPAC.
A DACPAC may be extracted from a database using any of these three tools:
SSMS Extract Data-Tier Application Wizard. With the extraction wizard, you can create a DACPAC
from an existing database. The wizard can facilitate some limited customization of the DACPAC—for
example, the application name and version number.
SqlPackage. The SqlPackage command-line utility can extract a DACPAC from an existing database
by using the Extract action.
PowerShell. For more complex automation than can be achieved using SqlPackage, DACPACs can
be extracted using PowerShell.
For more information on extracting a DACPAC from a database, see the topic Extract a DAC From a
Database in the SQL Server Technical Documentation:
Extract a DAC From a Database
https://aka.ms/Fqoemt
Demonstration Steps
1. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand Databases, right-
click Finance, point to Tasks, and then click Extract Data-tier Application.
3. On the Set Properties page, in the Save to DAC package file (include .dacpac extension with the
file name) box, type D:\Demofiles\Mod15\dacpac\Finance.dacpac, and then click Next.
4. On the Validation and Summary page, click Next. The extract will begin.
5. On the Build Package page, when the extraction process is complete, click Finish.
7. To import the DACPAC, in SSMS, in Object Explorer, under MIA-SQL, right-click Databases, and then
click Deploy Data-tier Application.
10. On the Update Configuration page, in the Name (the name of the deployed DAC and database)
box, type FinanceDAC, and then click Next.
11. On the Summary page click Next. The deployment will run.
12. On the Deploy DAC page, when the deployment is complete, click Finish.
13. In Object Explorer, right-click Databases, and then click Refresh. Verify that the FinanceDAC
database exists.
14. Expand FinanceDAC, expand Tables, right-click dbo.Currency, and then click Select Top 1000
Rows to verify that the table has been created with no data.
The company receives updates of currencies and exchange rates from an external provider. One of these
files is provided as an Excel spreadsheet, the other file is provided as a delimited text file. You must import
both these files into tables that will be used by the Accounting team.
Periodically, the Marketing team requires a list of prospects that have not been contacted within the last
month. You must create and test a package that will extract this information to a file for them.
The Accounting team has purchased a new application for tracking fixed assets. The database for this
application is installed as a data-tier application. You will install the DACPAC provided by the application
developers.
Objectives
After completing this lab, you will be able to:
Use the SQL Server Import and Export Wizard.
Password: Pa55w.rd
Install the Microsoft Access Database Engine 2016 Redistributable to enable SQL Server to import
Excel and Access data.
https://aka.ms/Tjh400
2. Use the SQL Server Import and Export Wizard to load the data from
D:\Labfiles\Lab15\Starter\Import\currency_codes.xlsx into
AdventureWorks.Accounts.CurrencyCode.
3. When the import is complete, open SSMS, and then open the project file
D:\Labfiles\Lab15\Starter\Project\Project.ssmssln and the Transact-SQL file Lab Exercise 01 -
currency codes.sql.
4. Execute the code under the heading for task 1 to view the data in the Accounts.CurrencyCode table.
Note: The currency exchange rates used in this exercise are example data; they do not
correspond to the actual currency exchange rates on the dates specified.
2. Execute the query under the heading for Task 2 to permit the foreign key constraints to be trusted.
3. Execute the query under the heading for Task 1 to examine the status of the constraints on the
Accounts.ExchangeRate table again. Are the foreign key constraints trusted?
2. Verify that a prospects.txt file has been created in the D:\Labfiles\Lab15\Starter\Export directory.
Question: What alternative methods to an SSIS package might you use to export the output
of the Sales.usp_prospect_list stored procedure to a file?
When planning a data transfer solution, consider the following best practices:
Best Practice:
Review Question(s)
Question: What other factors should you consider when importing or exporting data?
Course Evaluation
Your evaluation of this course will help Microsoft understand the quality of your learning experience.
Please work with your training provider to access the course evaluation form.
Microsoft will keep your answers to this survey private and confidential and will use your responses to
improve your future learning experience. Your open and honest feedback is valuable and appreciated.
3. In the User Account Control dialog box, click Yes, and wait for the script to finish.
2. In Object Explorer, right-click the MIA-SQL instance, and then click Properties.
3. In the Server Properties - MIA-SQL dialog box, on the Security page, verify that SQL Server and
Windows Authentication mode is selected, and then click OK.
3. In the Login - New dialog box, on the General page, in the Login name box, type
ADVENTUREWORKS\WebApplicationSvc.
5. In the Default database list, click AdventureWorks, and then click OK.
7. In the Login - New dialog box, on the General page, click Search.
9. In the Object Types dialog box, select Groups, and then click OK.
11. In the Locations dialog box, expand Entire Directory, click adventurework.msft, and then click OK.
12. In the Select User, Service Account, or Group dialog box, in the Enter the object name to select
box, type IT_Support, click Check Names, and then click OK.
13. In the Login - New dialog box, ensure that Windows authentication is selected.
14. In the Default database list, click AdventureWorks, and then click OK.
2. In the Login - New dialog box, on the General page, in the Login name box, type SalesSupport.
3. Click SQL Server authentication, and in the Password and Confirm Password boxes, type
Pa55w.rd.
4. Confirm that Enforce password policy is selected. Clear the Enforce password expiration check
box. The User must change password at next login check box will automatically be cleared.
5. In the Default database list, click AdventureWorks, and then click OK.
6. Leave SQL Server Management Studio open for the next exercise.
Results: After this exercise, you should have verified the authentication modes supported by the MIA-
SQL instance, and created three logins.
4. In the Login name box, type SalesSupport, and then click OK.
5. Under MIA-SQL, under Security, under Logins, right-click SalesSupport, and then click Properties.
6. In the Login Properties - SalesSupport dialog box, on the User Mapping page, verify that the login
is mapped to the ServiceUser user in AdventureWorks, and the default schema is dbo, and then
click OK.
USE AdventureWorks;
GO
CREATE USER [ITSupport] FOR LOGIN [ADVENTUREWORKS\IT_Support] WITH
DEFAULT_SCHEMA=[dbo]
GO
3. Click Execute.
4. In Object Explorer, in the AdventureWorks database, under Security, right-click Users then click
Refresh, verify that the ITSupport user appears.
Results: At the end of this exercise, you will have created three database users and mapped them to the
logins you created in the previous exercise.
2. At the Command Prompt, type the following command, and then press Enter:
3. Notice that the error message presented by sqlcmd is generic, reporting that the login failed but
giving no further details.
4. In SQL Server Management Studio, in Object Explorer, expand Management¸ expand SQL Server
Logs, and then double-click the log file whose name begins Current.
5. In the Log File Viewer - MIA SQL dialog box, in the right-hand pane, look for the topmost log entry
that begins Login failed for user ‘LegacySalesLogin’. The error message states that there was a
problem evaluating the login’s password.
6. Notice that the next line in the log file contains error number 18456, with a State value of 7.
The documentation for error 18456 indicates that a State value of 7 is caused when the login is disabled and the password is incorrect.
7. In the Log File Viewer - MIA SQL dialog box, click Close.
The login cannot connect because the account is disabled, and the wrong password is being used.
3. In Command Prompt, type the following command, and then press Enter:
4. Notice that the error message returned by the sqlcmd utility is generic, reporting that the login
failed but giving no further details.
5. In SQL Server Management Studio, in Object Explorer, under Management, under SQL Server Logs,
double-click the log file whose name begins Current.
6. In the Log File Viewer - MIA SQL dialog box, in the right-hand pane, look for the topmost log entry
that begins Login failed for user ‘LegacySalesLogin’. Read the rest of the entry to determine the
cause of the login failure. Notice that the login failed because the password was not correct.
7. In the Log File Viewer - MIA SQL dialog box, click Close.
2. In the Login Properties - LegacySalesLogin dialog box, on the General page, in the Password and
Confirm password boxes, type t0ps3cr3t, and then click OK.
3. In Command Prompt, type the following command, and then press Enter:
4. Notice that the error message indicates that the default database cannot be opened.
2. In the Login Properties - LegacySalesLogin dialog box, on the General page, in the Default
database list, click AdventureWorks.
3. On the User Mapping page, in the Users mapped to this login section, on the row for the
AdventureWorks database, select the Map check box, and then click OK.
4. In Command Prompt, type the following command, and then press Enter:
6. Close the command prompt window, but leave SQL Server Management Studio open for the next
exercise.
o ADVENTUREWORKS\WebApplicationSvc
o InternetSalesApplication
USE InternetSales
EXEC sp_change_users_login 'Report';
GO
USE InternetSales
EXEC sp_change_users_login 'Auto_Fix', 'InternetSalesApplication', NULL, NULL
GO
3. Select the query you have typed, and then click Execute. In the query output, notice that one
orphaned user was fixed by updating it.
USE InternetSales
EXEC sp_change_users_login 'Report';
GO
6. Select the query you have typed, and click Execute. Notice that no orphaned users are reported.
7. Close SQL Server Management Studio without saving any changes.
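Note that sp_change_users_login is deprecated in current SQL Server versions; the documented replacement for remapping an orphaned user is ALTER USER ... WITH LOGIN. A sketch, assuming the user and login share the name used in this exercise:

```sql
USE InternetSales;
GO
-- Remap the orphaned database user to the server login of the same name.
ALTER USER InternetSalesApplication WITH LOGIN = InternetSalesApplication;
```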
3. In the User Account Control dialog, click Yes when prompted to confirm that you want to run the
command file, and wait for the script to finish.
3. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have created the database_manager server role, granted
permissions to members to alter any login and alter any database, and granted membership to the
members of the Database_Managers login.
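The role, permissions, and membership described in these results can equally be created in T-SQL; the following sketch uses the names from this exercise:

```sql
USE master;
GO
CREATE SERVER ROLE database_manager;
GRANT ALTER ANY LOGIN TO database_manager;
GRANT ALTER ANY DATABASE TO database_manager;
ALTER SERVER ROLE database_manager ADD MEMBER [ADVENTUREWORKS\Database_Managers];
```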
3. In the Database role membership for: salesapp1 list, select the db_accessadmin and
db_backupoperator check boxes, and then click OK.
4. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have mapped the Database_Managers login to the
salesapp1 database and added them to the db_backupoperator and db_accessadmin roles.
3. In the Database User - New dialog box, on the General page, ensure that the User type box has the
value SQL user with login.
4. In the User name box, type internetsales_user, in the Login name box, type
ADVENTUREWORKS\InternetSales_Users, and then click OK.
2. In the Database Role - New dialog box, on the General page, in the Role name box, type
sales_reader, and then click Add.
4. In the Browse for Objects dialog box, select the [internetsales_user] check box, and then click OK.
6. In the Database Role - New dialog box, on the Securables page, click Search.
7. In the Add Objects dialog box, click Specific objects, and then click OK.
9. In the Select Object Types dialog box, select the Schemas check box, then click OK.
11. In the Browse for Objects dialog box, select the [Sales] check box, then click OK.
13. In the Database Role - New dialog box, in the Permissions for Sales section, on the Explicit tab, in
the Select row, select the Grant check box, and then click OK.
2. Highlight the code under the heading for Task 3 and click Execute.
2. Highlight the code you have just typed and click Execute.
Results: At the end of this exercise, you will have created user-defined database roles and assigned them
to database principals.
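The dialog-based steps in this exercise correspond approximately to the following T-SQL; this is a sketch using the names from the exercise:

```sql
USE InternetSales;
GO
CREATE USER internetsales_user FOR LOGIN [ADVENTUREWORKS\InternetSales_Users];
CREATE ROLE sales_reader;
ALTER ROLE sales_reader ADD MEMBER internetsales_user;
GRANT SELECT ON SCHEMA::Sales TO sales_reader;
```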
3. At the command prompt, type the following command (which opens the sqlcmd utility as
ADVENTUREWORKS\AnthonyFrizzell), and then press Enter:
4. When you are prompted for a password, type Pa55w.rd, and then press Enter. Wait for the
connection to succeed and the SQLCMD window to open.
5. In the SQLCMD window, at the command prompt, type the following commands to verify your
identity, and then press Enter:
SELECT SUSER_NAME();
GO
Note that SQL Server identifies Windows group logins using their individual user account, even
though there is no individual login for that user. ADVENTUREWORKS\AnthonyFrizzell is a member
of the ADVENTUREWORKS\IT_Support global group, which is in turn a member of the
ADVENTUREWORKS\Database_Managers domain local group for which you created a login.
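You can see how this group-based identity is resolved by inspecting the login token; sys.login_token lists the user account plus every group principal that contributes to the current session's identity:

```sql
-- The session's login token: the individual account plus each group
-- membership through which access was granted.
SELECT name, type, usage
FROM sys.login_token;
```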
6. In the SQLCMD window, at the command prompt, type the following commands to alter the
password of the Marketing_Application login, and then press Enter:
7. In the SQLCMD window, at the command prompt, type the following commands to disable the
ADVENTUREWORKS\WebApplicationSvc login, and then press Enter:
8. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
9. In SQL Server Management Studio, in Object Explorer, under MIA-SQL, under Security, right-click
Logins, and then click Refresh.
2. At the command prompt, when you are prompted for a password, type Pa55w.rd, and then press
Enter.
3. In the SQLCMD window, at the command prompt, type the following commands to query the
Sales.Orders table in the salesapp1 database, and then press Enter:
5. In the SQLCMD window, at the command prompt, type the following commands to update the
Sales.Orders table in the salesapp1 database, and then press Enter:
6. Verify that the user does NOT have UPDATE permission on the Sales.Orders table.
7. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
2. At the command prompt, when you are prompted for a password, type Pa55w.rd, and then press
Enter.
3. In the SQLCMD window, at the command prompt, type the following commands to query the
Sales.Orders table in the salesapp1 database, and then press Enter:
5. In the SQLCMD window, at the command prompt, type the following commands to query the
Production.Suppliers table in the salesapp1 database, and then press Enter:
8. Verify that the user has UPDATE permissions on the Sales.Orders table.
9. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
10. In the Command Prompt window, at the command prompt, type exit, and then press Enter.
11. Close SQL Server Management Studio without saving any changes.
Results: At the end of this exercise, you will have verified your new security settings.
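Instead of verifying permissions by attempting the operations, you can also query effective permissions directly; a sketch using the built-in fn_my_permissions function:

```sql
USE salesapp1;
GO
-- Lists the effective permissions of the current user on the table.
SELECT permission_name
FROM fn_my_permissions('Sales.Orders', 'OBJECT');
```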
3. In the User Account Control dialog, click Yes when prompted to confirm that you want to run the
command file, and wait for the script to finish.
4. On the File menu, point to New, and then click Query with Current Connection.
5. In the new query window, type the following code to grant permissions for the e-commerce
application to read data from the Products.vProductCatalog view and insert rows into the
Sales.SalesOrderHeader and Sales.SalesOrderDetail tables:
USE InternetSales;
GO
GRANT SELECT ON Products.vProductCatalog TO WebApplicationSvc;
GRANT INSERT ON Sales.SalesOrderHeader TO WebApplicationSvc;
GRANT INSERT ON Sales.SalesOrderDetail TO WebApplicationSvc;
GO
6. Below the code that you have just entered, type the following code to grant permissions for all sales
employees and managers to read data from the Customer table:
9. At the command prompt, type the following command to open the sqlcmd utility as
ADVENTUREWORKS\AnthonyFrizzell, who is a member of the IT_Support group, and then press Enter:
10. At the command prompt, when you are prompted for a password, type Pa55w.rd, and press Enter.
11. In the SQLCMD window, at the command prompt, type the following command to verify your
identity, and then press Enter:
SELECT suser_name();
GO
12. In the SQLCMD window, at the command prompt, type the following commands to verify that
Anthony can access the Customer table through his membership of the IT_Support global group,
and hence the Database_Managers local group and SQL Server login, and then press Enter:
USE InternetSales;
GO
SELECT TOP 5 FirstName, LastName FROM Customers.Customer;
GO
2. In SQL Server Management Studio, in the query window, below the existing code, type the following
code to deny the Database_Managers user SELECT permissions on the Customer table:
3. Select the code that you have just typed, and then click Execute.
4. In the SQLCMD window, at the command prompt, type the following command to verify that
Anthony is now denied access to the Customer table, and then press Enter:
2. In SQL Server Management Studio, in the query window, below the existing code, type the following
code to revoke the denied SELECT permission on the Customer table from the Database_Managers user:
3. Select the code that you have just typed, and then click Execute.
4. In the SQLCMD window, at the command prompt, type the following command to verify that
Anthony can access the Customer table through his membership of the Sales_Managers global
group, and hence the InternetSales_Managers local group and SQL Server login, and then press
Enter:
5. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
7. In the Microsoft SQL Server Management Studio dialog box, click No.
8. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you will have assigned the required object-level permissions.
3. On the File menu, point to New, and then click Query with Current Connection.
4. In the new query window, type the following code to grant permission for the sales managers to run
the ChangeProductPrice stored procedure, and then click Execute:
USE InternetSales;
GO
GRANT EXECUTE ON Products.ChangeProductPrice TO InternetSales_Managers;
GO
2. At the command prompt, when you are prompted for a password, type Pa55w.rd, and then press
Enter.
3. In the SQLCMD window, at the command prompt, type the following commands to verify that
Deanna can run the stored procedure, and then press Enter:
USE InternetSales;
GO
EXECUTE Products.ChangeProductPrice 1, 2;
GO
4. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
5. In SQL Server Management Studio, in the query window, below the existing code, type the following
code to check that the stored procedure updated the price:
6. Select the code that you have just typed, and then click Execute.
7. On the File menu, click Close.
8. In the Microsoft SQL Server Management Studio dialog box, click No.
9. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you will have assigned the required EXECUTE permissions on
stored procedures.
3. On the File menu, point to New, and then click Query with Current Connection.
4. In the new query window, type the following code to grant permission for the sales managers to
insert and update data in the Sales schema, and for the sales employees and managers to read data
in the Sales schema:
USE InternetSales;
GO
GRANT INSERT, UPDATE ON SCHEMA::Sales TO InternetSales_Managers;
GRANT SELECT ON SCHEMA::Sales TO InternetSales_Managers;
GRANT SELECT ON SCHEMA::Sales TO InternetSales_Users;
GO
2. At the command prompt, when you are prompted for a password, type Pa55w.rd, and then press
Enter.
3. In the SQLCMD window, at the command prompt, type the following commands to verify that
Anthony can access and update sales data, and then press Enter:
USE InternetSales;
GO
SELECT TOP 5 SalesOrderID, CustomerID FROM Sales.SalesOrderHeader;
GO
UPDATE Sales.SalesOrderHeader SET CustomerID=28389 WHERE SalesOrderID=43697;
GO
SELECT TOP 5 SalesOrderID, CustomerID FROM Sales.SalesOrderHeader;
GO
4. In the SQLCMD window, at the command prompt, type exit, and then press Enter.
7. In the Microsoft SQL Server Management Studio dialog box, click No.
9. In the Microsoft SQL Server Management Studio dialog box, click No.
Results: After completing this exercise, you will have assigned the required schema-level permissions.
3. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
2. In Object Explorer, expand the Security node, right-click the Audits node, and then click New Audit.
3. In the Create Audit dialog box, in the Audit name box, type activity_audit.
4. In the File path box, type D:\Labfiles\Lab04\Starter\Audit, and then click OK.
5. In Object Explorer, expand the Audits node, right-click the activity_audit node, and then click
Enable Audit.
2. In the Create Server Audit Specification dialog box, in the Name box, type audit_logins.
3. In the Audit box, type activity_audit.
4. In the Actions box, in the Audit Action Type list, select the SUCCESSFUL_LOGIN_GROUP value, and
then click OK.
5. In Object Explorer, expand the Server Audit Specifications node, right-click the audit_logins node,
and then click Enable Server Audit Specification.
3. In the Create Database Audit Specification dialog box, in the Name box, type
employees_change_audit, and in the Audit box, type activity_audit.
4. In the Actions box, in the Audit Action Type list, click the INSERT value.
6. In the Object Name column, in the first row, click the ellipsis (…).
7. In the Select Objects dialog box, in the Enter the object names to select (examples) box, type
HR.Employees, and then click OK.
8. In the Principal Name column, in the first row, click the ellipsis (…).
9. In the Select Objects dialog box, in the Enter the object names to select (examples) box, type
public, and then click OK.
10. On the second row, in the Audit Action Type list, click the UPDATE value.
11. In the Object Class list on the second row, click OBJECT.
12. In the Object Name column, in the second row, click the ellipsis (…).
13. In the Select Objects window, in the Enter the object names to select (examples) box, type
HR.Employees, and then click OK.
14. In the Principal Name column, in the second row, click the ellipsis (…).
15. In the Select Objects window, in the Enter the object names to select (examples) box, type public,
and then click OK.
16. In the Create Database Audit Specification dialog box, click OK.
17. In Object Explorer, expand the Database Audit Specifications node, right-click the
employees_change_audit node, and then click Enable Database Audit Specification.
18. In the Enable Database Audit Specification dialog box, click Close.
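The server-level audit objects created through the dialog boxes above correspond approximately to the following T-SQL sketch (object names as used in the steps, file path from step 4):

```sql
USE master;
GO
-- Server audit writing to a file target.
CREATE SERVER AUDIT activity_audit
    TO FILE (FILEPATH = 'D:\Labfiles\Lab04\Starter\Audit');
ALTER SERVER AUDIT activity_audit WITH (STATE = ON);

-- Server audit specification capturing successful logins.
CREATE SERVER AUDIT SPECIFICATION audit_logins
    FOR SERVER AUDIT activity_audit
    ADD (SUCCESSFUL_LOGIN_GROUP)
    WITH (STATE = ON);
```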
4. Highlight the code under the heading Task 1, and click Execute.
SELECT *
FROM sys.fn_get_audit_file('D:\Labfiles\Lab04\Starter\Audit\*', default, default)
WHERE session_id = @@SPID;
3. In the Always Encrypted dialog box, on the Introduction page, click Next.
4. On the Column Selection page, under Sales.Customers, select the phone row.
5. Change the value of the Encryption Type box to Randomized, and then click Next.
2. Review the output of the script. The script demonstrates how enabling the Column Encryption
Setting property in the connection string allows the client to decrypt the Always Encrypted column.
This is possible because the script has access to the column master key in the local Windows
certificate store.
3. When you have finished reviewing the results, press Enter to close the PowerShell window.
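For reference, the Column Encryption Setting property mentioned above is part of the client connection string; a hypothetical ADO.NET-style example (server and database names are illustrative only):

```
Server=MIA-SQL;Database=AdventureWorks;Integrated Security=true;Column Encryption Setting=Enabled
```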
2. Highlight the code under the heading for Task 1, and click Execute.
2. Edit the query under the heading for Task 4 so that it reads:
Task 4: Create a Database Encryption Key and Encrypt the salesapp1 Database
1. Edit the query under the heading for Task 5 so that it reads:
USE salesapp1;
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDE_cert;
GO
3. Highlight the code under the heading for Task 6, and click Execute.
4. Highlight the code under the heading for Task 7, click Execute, and then review the results.
2. In the Detach Database dialog box, select the Drop Connections check box, and then click OK.
5. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Attach.
8. In the Microsoft SQL Server Management Studio dialog box, notice that an error message is
displayed because the certificate with which the database encryption key is protected does not exist
on the MIA-SQL\SQL2 instance; as a result, the data file cannot be attached. Click OK.
10. In Solution Explorer, double-click the Lab Exercise 03 - move DB.sql query.
11. On the Query menu, point to Connection, and then click Change Connection.
12. In the Connect to Database Engine dialog box, connect to the MIA-SQL\SQL2 database engine
using Windows authentication.
13. Highlight the code under the heading for Task 10, and click Execute. This creates a server master key
on MIA-SQL\SQL2.
14. Highlight the code under the heading for Task 11, and click Execute. This creates a certificate in the
master database on MIA-SQL\SQL2 from the backup files you created previously.
15. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click Databases and click Attach.
17. In the Attach Databases dialog box, navigate to the D:\Labfiles\Lab04\Starter\Setupfiles folder,
click the salesapp1.mdf file, and then click OK.
19. Highlight the code under the heading for Task 12, and click Execute. This queries the
Sales.Customers table in the salesapp1 database.
20. Review the query results then close SQL Server Management Studio without saving any files.
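The certificate-restore script referred to in step 14 is not reproduced in this answer key; restoring a TDE certificate from backup files generally follows this pattern (file paths and password here are hypothetical; the actual values come from the lab files):

```sql
-- Hypothetical sketch only; actual paths and password come from the lab files.
USE master;
GO
CREATE CERTIFICATE TDE_cert
    FROM FILE = 'D:\Backups\TDE_cert.cer'
    WITH PRIVATE KEY (
        FILE = 'D:\Backups\TDE_cert.pvk',
        DECRYPTION BY PASSWORD = '<password used when backing up>');
```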
3. In the User Account Control dialog box, click Yes to run the command file, and wait for the script to
finish.
Full database backup should complete in approximately 3.4 hours (20 GB/100 MB per minute). This
means that you cannot employ a strategy of full database backups alone, because it would not meet
the RPO. The full database backup should complete by 22:24, which is within the available time
window for backups.
Each log backup should complete in approximately 3.4 minutes (1 GB per hour/3 log backups per
hour/100 MB per minute). This fits within the 20-minute interval and meets the RPO.
Full database restore should complete in approximately 4.3 hours (20 GB/80 MB per minute). Log file
restore should complete in approximately 2.1 hours (10 hours * 1 GB per hour/80 MB per minute),
which meets the RTO.
Notes:
Recovery to last full daily database backup complies with the RPO.
Daily backup should complete in approximately 2 minutes (200 MB/100 MB per minute).
Full restore should complete in approximately 2.5 minutes and complies with RTO (200 MB/80 MB
per minute).
Results: At the end of this exercise, you will have created a plan to back up two databases.
2. In the Connect to Server dialog box, in the Server name box, type MIA-SQL, and then click
Connect.
3. In Object Explorer, expand Databases, right-click MarketDev, and then click Properties.
4. On the Options page, in the Recovery model list, click Full, and then click OK.
5. In Object Explorer, under Databases, right-click Research, and then click Properties.
6. On the Options page, verify that the Recovery model is set to Simple, and then click Cancel.
Results: At the end of this exercise, you will have modified the database recovery models where required.
2. For the PotentialIssue database, the 15-minute log backups would meet the RPO. A full restore
should take approximately 24 minutes ((200 MB + (7 days * 24 hours) * 10 MB per hour)/80 MB per
minute), which meets the RTO. The full database backup would complete in approximately two
minutes (200 MB/100 MB per minute), which means that the full database backup would complete
within the available time window. The backup strategy for the PotentialIssue database meets the
business requirements.
Results: At the end of this exercise, you will have assessed the backup strategy.
3. In the User Account Control dialog box, click Yes, and wait for the script to finish.
2. In Object Explorer, expand Databases, right-click AdventureWorks, and then click Properties.
3. In the Database Properties - AdventureWorks dialog box, on the Options page, in the Recovery
model drop-down list, click Simple, and then click OK.
2. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
3. In the Destination section, click the existing file path, click Remove, and then click Add.
4. In the Select Backup Destination dialog box, in the File name box, type
D:\Backups\AdventureWorks.bak, and then click OK.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, click Back up
to a new media set, and erase all existing backup sets.
8. In the Set backup compression list, click Compress backup, and then click OK.
2. In the query pane, type the following Transact-SQL code, and then click Execute.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
3. Note the number of rows affected, and then close the query pane without saving the file.
2. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
5. On the Backup Options page, in the Name box, type AdventureWorks-Full Database Backup 2.
6. In the Set backup compression list, click Compress backup, and then click OK.
8. In File Explorer, in the D:\Backups folder, verify the AdventureWorks.bak backup file has increased
in size.
2. In the Backup and Restore Events [AdventureWorks] report, expand Successful Backup
Operations and view the backup operations that have been performed for this database.
3. In the Device Type column, expand each of the Disk (temporary) entries to view details of the
backup media set file.
Results: At the end of this exercise, you will have backed up the AdventureWorks database to
D:\Backups\AdventureWorks.bak using the simple recovery model.
2. In the Database Properties - AdventureWorks dialog box, on the Options page, in the Recovery
model drop-down list, ensure that Full is selected, and then click OK.
2. In the Back Up Database - AdventureWorks dialog box, ensure that Backup type is set to Full.
3. In the Destination section, click the existing file path, click Remove, and then click Add.
4. In the Select Backup Destination dialog box, in the File name box, type
D:\Backups\AWNational.bak, and then click OK.
5. In the Back Up Database - AdventureWorks dialog box, on the Media Options page, click Back up
to a new media set, and erase all existing backup sets.
6. In the New media set name box, type AdventureWorks Backup.
8. In the Set backup compression list, click Compress backup, and then click OK.
10. In File Explorer, in the D:\Backups folder, verify that the backup file AWNational.bak has been
created, and note its size.
2. In the query pane, type the following Transact-SQL code, and then click Execute.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
3. Note the number of rows affected, and then close the query pane without saving the file.
2. In the Back Up Database - AdventureWorks dialog box, in the Backup type list, click Transaction
Log.
4. On the Media Options page, ensure that Back up to the existing media set and Append to the
existing backup set are selected.
5. On the Backup Options page, in the Name box, type AdventureWorks-Transaction Log Backup.
6. In the Set backup compression list, click Compress backup, and then click OK.
8. In File Explorer, in the D:\Backups folder, verify the AWNational.bak backup file has increased in
size.
2. In the query pane, type the following Transact-SQL code, and then click Execute.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
3. Note the number of rows affected, and then close the query pane without saving the file.
2. In the Back Up Database - AdventureWorks dialog box, in the Backup type list, click Differential.
5. On the Backup Options page, in the Name box, type AdventureWorks-Differential Backup.
6. In the Set backup compression list, click Compress backup, and then click OK.
8. In File Explorer, in the D:\Backups folder, verify the AWNational.bak backup file has increased in
size.
2. In the query pane, type the following Transact-SQL code, and then click Execute.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
3. Note the number of rows affected, and then close the query pane without saving the file.
2. In the Back Up Database - AdventureWorks dialog box, in the Backup type list, click Transaction
Log.
4. On the Media Options page, ensure that Back up to the existing media set and Append to the
existing backup set are selected.
5. On the Backup Options page, in the Name box, type AdventureWorks-Transaction Log Backup 2.
6. In the Set backup compression list, click Compress backup, and then click OK.
8. In File Explorer, in the D:\Backups folder, verify the AWNational.bak backup file has increased in
size.
2. In the query pane, type the following Transact-SQL code, and then click Execute:
RESTORE HEADERONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
3. Verify that the backups you performed in this exercise are all listed.
RESTORE FILELISTONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
RESTORE VERIFYONLY
FROM DISK = 'D:\Backups\AWNational.bak';
GO
7. Verify that the backup is valid then close the query pane without saving the file.
Results: At the end of this exercise, you will have backed up the national database to
D:\Backups\AWNational.bak.
2. This script creates the read-only components required for this lab. When it has completed, close
the query pane without saving the file.
3. In File Explorer, in the D:\Labfiles\Lab06\Starter folder, verify that the backup file
AWReadOnly.bak has been created.
2. In the query pane, type the following Transact-SQL code, and then click Execute:
3. In File Explorer, in the D:\Labfiles\Lab06\Starter folder, verify that the backup file AWPartial.bak
has been created.
2. In the query pane, type the following Transact-SQL code, and then click Execute.
UPDATE HumanResources.Employee
SET VacationHours = VacationHours + 10 WHERE SickLeaveHours < 30;
3. Note the number of rows affected, and then close the query pane without saving the file.
3. In File Explorer, in the D:\Labfiles\Lab06\Starter folder, verify that the backup file
AWPartialDifferential.bak has been created.
3. View the backups on this backup media, and scroll to the right to view the BackupTypeDescription
column.
5. View the backups on this backup media, and scroll to the right to view the BackupTypeDescription
column.
6. Close SQL Server Management Studio without saving any script files.
Results: At the end of this exercise, you will have backed up the read-only filegroup in the
AdventureWorks database to D:\Backups\AWReadOnly.bak; and you will have backed up the writable
filegroups in the AdventureWorks database to D:\Backups\AWReadWrite.bak.
2. In the D:\Labfiles\Lab07\Starter folder, right-click the Setup.cmd file, and then click Run as
administrator.
2. In Object Explorer, expand Databases, and note that the HumanResources database is in a
Recovery Pending state.
5. Click Execute, and note the error message that is displayed. The database cannot be brought online
because the primary data file is lost.
3. In Object Explorer, verify that the HumanResources database is now recovered and ready to use. You
may need to click the Refresh button.
Results: After this exercise, you should have restored the HumanResources database.
3. In the query pane, type the following Transact-SQL code to attempt to bring the database online:
4. Click Execute, and note the error message that is displayed. There is a problem with the primary data
file.
USE master;
BACKUP LOG InternetSales TO DISK = 'D:\Labfiles\Lab07\Backups\InternetSales.bak'
WITH NO_TRUNCATE;
4. Click Execute, and view the resulting message to verify that the backup is successful.
4. In the Restore Database dialog box, in the Source section, click Device, and then click the ellipsis
(...) button.
8. Note that the backup media contains a full backup, and a transaction log backup (these are the
planned backups in InternetSales.bak), in addition to a copy-only transaction log backup (which is
the tail-log backup). All of these are automatically selected in the Restore column.
9. On the Files page, select the Relocate all files to folder check box.
10. In the Data file folder, delete the existing text, and type D:\Labfiles\Lab07\Backups.
11. In the Log file folder, delete the existing text, and type D:\Labfiles\Lab07\Backups.
12. On the Options page, ensure that the Recovery state is set to RESTORE WITH RECOVERY.
13. In the Script drop-down list, click New Query Editor Window, and then click OK.
14. When the database has been restored successfully, click OK.
15. View the Transact-SQL code that was used to restore the database, noting that the full backup, the
differential backup, and the first transaction log backup, were restored using the NORECOVERY
option. The restore operation for the tail-log backup used the default RECOVERY option to recover
the database.
16. In Object Explorer, under MIA-SQL\SQL2, verify that the InternetSales database is now recovered
and ready to use.
Results: After this exercise, you should have restored the InternetSales database.
3. In the query pane, type the following Transact-SQL code to start a partial restore of the database
from the full backup set in position 1, in the AWDataWarehouse.bak media set:
USE master;
RESTORE DATABASE AWDataWarehouse FILEGROUP='Current'
FROM DISK = 'D:\Labfiles\Lab07\Backups\AWDataWarehouse.bak'
WITH REPLACE, PARTIAL, FILE = 1, NORECOVERY;
4. Click Execute, and view the resulting message to verify that the restore is successful.
5. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click the Databases folder, and then
click Refresh; verify that AWDataWarehouse is listed with a “Restoring” status.
2. Select the code you just entered, click Execute, and then view the resulting message to verify that
the restore is successful.
3. In Object Explorer, under the MIA-SQL\SQL2 instance, right-click the Databases folder, click
Refresh, and then verify that AWDataWarehouse is now shown as online.
5. In Object Explorer, right-click dbo.FactInternetSalesArchive, and then click Select Top 1000 Rows.
Note that you cannot retrieve data from this table, which is stored in the read-only Archive filegroup.
Results: After this exercise, you will have restored the AWDataWarehouse database.
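If the read-only Archive filegroup later needs to be brought back online, the piecemeal restore can be completed with a filegroup restore; the following is a hypothetical continuation (the backup set position within the media set is assumed):

```sql
-- Hypothetical continuation: restore the read-only Archive filegroup
-- from the same media set, then recover it.
RESTORE DATABASE AWDataWarehouse FILEGROUP = 'Archive'
FROM DISK = 'D:\Labfiles\Lab07\Backups\AWDataWarehouse.bak'
WITH RECOVERY;
```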
2. In the D:\Labfiles\Lab08\Starter folder, right-click the Setup.cmd file and then click Run as
administrator.
3. In the User Account Control dialog box, click Yes, and wait for the script to finish.
2. In Object Explorer, expand SQL Server Agent, and then expand Jobs to view any existing jobs.
3. Right-click Jobs, and then click New Job.
4. In the New Job dialog box, on the General page, in the Name box, type Backup HumanResources.
6. In the New Job Step dialog box, on the General page, in the Step name box, type Back Up
Database.
7. In the Database list, click HumanResources, and, in the Command box, type the following
command:
8. In the New Job Step dialog box, on the Advanced page, in the Output file box, type
D:\Labfiles\Lab08\Starter\BackupLog.txt, and then click OK.
9. In the New Job dialog box, on the Steps page, click New.
10. In the New Job Step dialog box, on the General page, in the Step name box, type Move Backup
File.
11. In the Type list, click Operating system (CmdExec), and in the Command box, type the following
command, which moves the backup file to the D:\Labfiles\Lab08\Starter folder:
15. In Object Explorer, in the Jobs folder, verify that the job appears.
16. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have created a job named Backup HumanResources.
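The New Job dialog steps above can also be expressed with the msdb job procedures. This sketch mirrors the structure of the job; the backup and move commands are placeholders, because the lab's actual command text is entered in the GUI and is not reproduced here:

```sql
USE msdb;

-- Create the job shell.
EXEC dbo.sp_add_job @job_name = N'Backup HumanResources';

-- Step 1: Transact-SQL backup step with an output log file.
EXEC dbo.sp_add_jobstep
    @job_name = N'Backup HumanResources',
    @step_name = N'Back Up Database',
    @subsystem = N'TSQL',
    @database_name = N'HumanResources',
    @command = N'BACKUP DATABASE HumanResources TO DISK = N''HumanResources.bak'';',  -- placeholder
    @output_file_name = N'D:\Labfiles\Lab08\Starter\BackupLog.txt';

-- Step 2: operating system (CmdExec) step that moves the backup file.
EXEC dbo.sp_add_jobstep
    @job_name = N'Backup HumanResources',
    @step_name = N'Move Backup File',
    @subsystem = N'CmdExec',
    @command = N'move <source file> D:\Labfiles\Lab08\Starter';  -- placeholder

-- Register the job with the local server so SQL Server Agent can run it.
EXEC dbo.sp_add_jobserver @job_name = N'Backup HumanResources';
```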
3. Note that the Start Jobs - MIA-SQL dialog shows an error, and then click Close.
2. In the Log File Viewer - MIA-SQL dialog box, expand the first row by clicking the date cell.
4. Click the Step ID 2 cell, resize the lower panel that contains the text Selected row details:, and then
scroll to the bottom.
2. In the Job Properties - Backup HumanResources dialog box, on the Steps page, click Move
Backup File, and then click Edit.
3. In the Job Step Properties - Move Backup File dialog box, in the Command box, delete the
existing text, replace it with the following command, and then click OK:
7. Note that in the Start Jobs - MIA-SQL dialog box the job finished with success, and then click Close.
8. In File Explorer, navigate to D:\Labfiles\Lab08\Starter and note that the HumanResources.bak file
is present.
9. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have tested the SQL Server Agent job and confirmed that it
executes successfully.
Scroll down to the Notification area section, and then click Turn system icons on or off.
2. On the Turn system icons on or off page, in the Clock row, ensure that the switch is set to On.
6. In the Job properties - Backup HumanResources dialog box, on the Schedules page, click New.
7. In the New Job Schedule dialog box, in the Name box, type Daily Backup, and then in the
Frequency area, in the Occurs list, click Daily.
8. In the Daily frequency section, in the Occurs once at section, set the time to two minutes from the
current system time as shown in the clock in the notification area, and then click OK.
9. In the Job properties - Backup HumanResources dialog box, click OK, and wait until the system
clock shows the scheduled time.
2. If the job is still running, click Refresh until the Status changes to Idle.
3. Verify that the Last Run Outcome for the job is Succeeded, that the Last Run time is the time that
you scheduled previously, and then click Close.
4. Leave SQL Server Management Studio open for the next exercise.
Results: At the end of this exercise, you will have created a schedule for the Backup HumanResources
job.
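The schedule created in steps 6 to 9 corresponds to the msdb schedule procedures. A sketch follows; the start time is illustrative, because the lab uses a time two minutes ahead of the current system clock:

```sql
USE msdb;

-- Daily schedule; @freq_type = 4 means daily, @freq_interval = 1 means
-- every day. @active_start_time is in HHMMSS form (09:00:00 here).
EXEC dbo.sp_add_schedule
    @schedule_name = N'Daily Backup',
    @freq_type = 4,
    @freq_interval = 1,
    @active_start_time = 90000;

-- Attach the schedule to the job.
EXEC dbo.sp_attach_schedule
    @job_name = N'Backup HumanResources',
    @schedule_name = N'Daily Backup';
```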
4. On the Target Servers page, in the Registered servers section, expand Database Engine, and
expand Local Server Groups.
5. Click mia-sql\sql2, click >, click mia-sql\sql3, click >, and then click Next.
3. On the Select Plan Properties page, in the Name box, type System Database Backups.
4. In the Schedule section, click Change.
5. In the New Job Schedule dialog box, in the Frequency section, in the Occurs list, click Daily.
6. In the Daily frequency section, in the Occurs once at box, set the time to five minutes from the
current system time as shown in the clock in the notification area, and then click OK.
8. On the Select Target Servers page, select the boxes next to MIA-SQL\SQL2 and MIA-SQL\SQL3,
and then click Next.
9. On the Select Maintenance Tasks page, check Back Up Database (Full), and then click Next.
11. On the Define Backup Database (Full) Task page, on the General tab, in the Database(s) list, click
System databases, and then click OK.
12. On the Destination tab, select the Create a sub-directory for each database check box, in the
Folder box, type D:\, and then click Next.
13. On the Select Report Options page, in the Folder location box, type D:\, and then click Next.
15. On the Maintenance Plan Wizard Progress page, wait for the operation to complete, and then click
Close.
16. In Object Explorer, right-click SQL Server Agent, point to Multi Server Administration, and then
click Manage Target Servers.
17. In the Target Server list, click MIA-SQL\SQL2, and then click Force Poll.
18. In the Force Target Server to Poll MSX Immediately dialog box, click OK.
19. In the Target Server list, click MIA-SQL\SQL3, and then click Force Poll.
20. In the Force Target Server to Poll MSX Immediately dialog box, click OK.
Note: Forcing the target servers to poll the master server causes them to download a
copy of the System Database Backups jobs.
21. In the Target Server Status - MIA-SQL dialog box, click Close.
22. In File Explorer, view the D:\ folder. Log files should have been created for each server and for each
system database; a folder should have been created holding backup files from each SQL Server
instance.
2. In the Job Activity Monitor - MIA-SQL window, note the rows in the Agent Job Activity table for:
4. In the Log File Viewer - MIA-SQL dialog box, review the history information, and then click Close.
5. Right-click System Database Backups.Subplan_1 (Multi-Server), and then click View history.
6. Note that there is less detail in the Log File Viewer for the multi-server jobs. For more detailed
information, use the Job Activity Monitor on the target server.
12. In the Job Activity Monitor - MIA-SQL\SQL2 dialog box, right-click System Database
Backups.Subplan_1 (Multi-Server), and then click View history.
13. Note that there is more information shown on the target server.
14. In the Log File Viewer - MIA-SQL\SQL2 dialog box, click Close.
15. In the Job Activity Monitor - MIA-SQL\SQL2 dialog box, click Close.
16. Close SQL Server Management Studio.
Results: At the end of this exercise, you will have created and executed a maintenance plan across
multiple servers.
3. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
2. In Object Explorer, expand SQL Server Agent, expand Jobs, right-click Generate Sales Log, and
then click View History.
3. In the Log File Viewer - MIA-SQL window, expand the first job execution by clicking the plus sign on
a row in the right pane, and then scroll the window to the right so that the Message column is
visible. (The job is started on a schedule, so one or more rows of job history might be visible.)
4. Notice that the failure message for the job step reads as follows:
Non-SysAdmins have been denied permission to run DTS Execution job steps without a
proxy account. The step failed.
5. Click Close.
6. In Object Explorer, right-click Generate Sales Log, and then click Properties.
7. In the Job Properties - Generate Sales Log window, notice that the owner of the job is the
PromoteApp09 login, and then click Cancel.
8. Leave SQL Server Management Studio open for the next exercise. The job step is failing because the
job is owned by a login who is not a member of the sysadmin role.
Results: After completing this exercise, you should have identified the cause of the job failure.
2. In the New Credential dialog box, in the Credential name box, type ExtractUser.
5. In the Locations dialog box, click Entire Directory, and then click OK.
6. In the Select User, Service Account, or Group dialog box, in the Enter the object name to select
box, type Student, click Check Names, and then click OK.
7. In the New Credential window, in the Password and Confirm password boxes, type Pa55w.rd, and
then click OK.
9. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you should have created a credential that references the
ADVENTUREWORKS\Student Windows account.
5. In the Browse for Objects dialog box, select ExtractUser, and then click OK.
7. In the New Proxy Account window, in the Active to the following subsystems box, select SQL
Server Integration Services Package.
9. In the Add Principal dialog box, verify that Principal type has the value SQL Login, select
PromoteApp09, and then click OK.
12. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you should have created a proxy account that is suitable for
correcting the problem with the SQL Server Agent job called Generate Sales Log.
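The credential and proxy built in the two exercises above map to the following Transact-SQL. This is a sketch: the proxy name ExtractProxy is taken from the next exercise, and the subsystem is identified here by ID 11 (SQL Server Integration Services Package):

```sql
-- Server-level credential that stores the Windows account and password.
USE master;
CREATE CREDENTIAL ExtractUser
WITH IDENTITY = N'ADVENTUREWORKS\Student', SECRET = N'Pa55w.rd';

USE msdb;

-- Proxy account based on the credential.
EXEC dbo.sp_add_proxy
    @proxy_name = N'ExtractProxy',
    @credential_name = N'ExtractUser';

-- Activate the proxy for the SSIS package execution subsystem (ID 11).
EXEC dbo.sp_grant_proxy_to_subsystem
    @proxy_name = N'ExtractProxy',
    @subsystem_id = 11;

-- Allow the PromoteApp09 login to use the proxy.
EXEC dbo.sp_grant_login_to_proxy
    @proxy_name = N'ExtractProxy',
    @login_name = N'PromoteApp09';
```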
2. In the Job Properties - Generate Sales Log window, on the Steps page, click Edit.
3. On the Job Step Properties - Execute Package page, in the Run as box, click ExtractProxy, and
then click OK.
Results: After completing this exercise, the Generate Sales Log SQL Server Agent job should be working
correctly, and the sales_log.csv file should be generated in D:\Labfiles\Lab09\Starter\SalesLog each
time the job runs.
3. In the User Account Control dialog box, click Yes and wait for the script to finish.
2. In Object Explorer, expand Management, right-click Database Mail, and then click Configure
Database Mail.
6. In the Add Account to profile 'SQL Server Agent Profile' dialog box, click New Account.
7. In the New Database Mail Account dialog box, enter the following details and click OK:
11. On the Complete the Wizard page, click Finish, and when configuration is complete, click Close.
2. In the Send Test E-Mail from MIA-SQL dialog box, ensure that the SQL Server Agent Profile
database mail profile is selected.
3. In the To box, type [email protected], and then click Send Test Email.
4. In File Explorer, navigate to the C:\inetpub\mailroot\Drop folder, and verify that an email message
has been created.
5. Double-click the message to view it in Outlook. When you have read the message, close it and
minimize the Drop folder window.
6. In the Database Mail Test E-Mail dialog box (which might be behind SQL Server Management
Studio), click OK.
8. In the query pane, type the following Transact-SQL code, and then click Execute:
9. Review the results. The first result shows system events for Database Mail, and the second shows
records of email messages that have been sent.
Results: After this exercise, you should have configured Database Mail with a new profile named SQL
Server Agent Profile.
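The Transact-SQL entered in step 8 is not reproduced in this answer key. Queries along the following lines return the two result sets described in step 9 (a sketch):

```sql
-- System events recorded by Database Mail.
SELECT * FROM msdb.dbo.sysmail_event_log;

-- Records of email messages that have been sent.
SELECT * FROM msdb.dbo.sysmail_sentitems;
```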
3. In the Mail profile drop-down list, click SQL Server Agent Profile.
4. Select Enable fail-safe operator, and then in the Operator drop-down list, click DBA Team.
5. In the Notify using section, select E-mail, and then click OK.
6. In Object Explorer, right-click SQL Server Agent, and then click Restart.
8. In the Microsoft SQL Server Management Studio dialog box, click Yes.
Results: After this exercise, you should have created operators named Student and DBA Team, and
configured the SQL Server Agent service to use the SQL Server Agent Profile Database Mail profile.
2. In the New Alert dialog box, on the General page, in the Name box, type InternetSales Log Full
Alert.
3. In the Database name list, click InternetSales.
5. On the Response page, select Execute job, and in the drop-down list, click Back Up Log -
InternetSales ([Uncategorized (Local)]).
6. Select Notify operators, and then select E-mail for the Student operator.
7. On the Options page, under Include alert error text in, select E-mail, and then click OK.
3. In the Job Properties - Back Up Database - AWDataWarehouse dialog box, on the Notifications
page, select E-mail.
5. In the second drop-down list, click When the job fails, and then click OK.
7. In the Job Properties - Back Up Database - HumanResources dialog box, on the Notifications
page, select E-mail.
8. In the first drop-down list, click Student.
9. In the second drop-down list, click When the job fails, and then click OK.
11. In the Job Properties - Back Up Database - InternetSales dialog box, on the Notifications page,
select E-mail.
13. In the second drop-down list, click When the job completes, and then click OK.
14. Right-click the Back Up Log - InternetSales job, and then click Properties.
15. In the Job Properties - Back Up Log - InternetSales dialog box, on the Notifications page, select
E-mail.
16. In the first drop-down list, click Student.
17. In the second drop-down list, click When the job completes, and then click OK.
18. Expand the Operators folder, right-click Student, and then click Properties.
19. In the Student Properties dialog box, on the Notifications page, click Jobs, verify the job
notifications that have been defined for this operator, and then click Cancel.
Results: After this exercise, you should have created an alert named InternetSales Log Full Alert.
Also, you should have configured the Back Up Database - AWDataWarehouse, Back Up Database -
HumanResources, Back Up Database - InternetSales, and Back Up Log - InternetSales jobs to send
notifications.
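The alert and notification configuration above can also be scripted with the msdb procedures. A sketch follows; error 9002 is the transaction-log-full error referenced later in the lab, and the notification levels shown are illustrative:

```sql
USE msdb;

-- Alert on error 9002 (transaction log full) in InternetSales,
-- responding by running the log backup job.
EXEC dbo.sp_add_alert
    @name = N'InternetSales Log Full Alert',
    @message_id = 9002,
    @database_name = N'InternetSales',
    @job_name = N'Back Up Log - InternetSales';

-- E-mail the Student operator when the alert fires (1 = e-mail).
EXEC dbo.sp_add_notification
    @alert_name = N'InternetSales Log Full Alert',
    @operator_name = N'Student',
    @notification_method = 1;

-- E-mail the Student operator when a job fails (2 = on failure).
EXEC dbo.sp_update_job
    @job_name = N'Back Up Database - HumanResources',
    @notify_level_email = 2,
    @notify_email_operator_name = N'Student';
```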
3. On the toolbar, click Execute and wait for the script to finish. When the log file for the InternetSales
database is full, error 9002 occurs.
4. In Object Explorer, expand Alerts, right-click InternetSales Log Full Alert, and then click Properties.
5. In the 'InternetSales Log Full Alert' alert properties dialog box, on the History page, note the
Date of last alert and Date of last response values, and then click Cancel.
6. In File Explorer, in the C:\inetpub\mailroot\Drop folder, verify that four new email messages have
been created.
7. Double-click the new email messages to view them in Outlook. They should include a notification that
the transaction log was filled, and a notification that the Back Up Log - InternetSales job completed.
When you have finished checking them, close all email messages and minimize the Drop window.
2. When the job has completed, note that it failed and click Close.
3. In Object Explorer, under Jobs, right-click Back Up Database - HumanResources, and click Start
Job at Step.
4. When the job has completed, note that it succeeded and click Close.
5. In Object Explorer, under Jobs, right-click Back Up Database - InternetSales, and click Start Job at
Step.
6. When the job has completed, note that it succeeded and click Close.
7. Under the Operators folder, right-click Student, and then click Properties.
8. In the Student Properties dialog box, on the History page, note the date and time of the most
recent notification by email, and then click Cancel.
9. In File Explorer, in the C:\inetpub\mailroot\Drop folder, verify that new email messages have been
created.
10. Open each of the messages and verify that they include a failure notification for the Back Up
Database - AWDataWarehouse job and a completion notification for the Back Up Database -
InternetSales job, but no notification regarding the Back Up Database - HumanResources job.
11. When you have read the messages, close them and close the Drop window.
12. Close SQL Server Management Studio without saving any files.
Results: After this exercise, you will have verified that the notifications you configured for backups of the
AWDataWarehouse, HumanResources, and InternetSales databases work as expected.
You will also have verified that an alert is triggered when the transaction log of the InternetSales database
is full.
3. In the User Account Control dialog box, click Yes, and wait for the script to finish.
3. In the console, type Update-Help, and then press Enter. Wait for the help files to update.
4. At the command prompt, type Get-Help, and then press Enter to display the summary help page.
5. At the command prompt, type Get-Help Get-ChildItem, and then press Enter to display help for the
cmdlet.
6. At the command prompt, type Get-Command Remove-*, and then press Enter to get a list of all the
cmdlets that start with the verb Remove-.
7. At the command prompt, type Get-Command *sql*, and then press Enter to get a list of all the
cmdlets that include “sql”.
8. At the command prompt, type Get-Item, and then press TAB to view tab-completion of the cmdlet.
2. At the command prompt, type Set-Location Env:, and then press Enter to go to the Environment
location.
3. At the command prompt, type Get-ChildItem, and then press Enter to list the contents of the
location.
4. At the command prompt, type Get-PSDrive, and then press Enter to list the available drives.
6. At the command prompt, type Import-Module SQLPS, and then press Enter to import the SQL
PowerShell module. You can ignore any warnings that appear.
7. At the command prompt, type Get-PSProvider, and then press Enter to list the PowerShell providers.
Note that SqlServer now appears in the list.
8. At the command prompt, type Get-PSDrive, and then press Enter to list the available drives. Note
that SqlServer now appears in the list.
9. At the command prompt, type Get-Help Get-PSProvider, press Enter, and then read the returned
information to learn more about the Get-PSProvider cmdlet.
10. At the command prompt, type Get-Help Get-PSDrive, press Enter, and then read the returned
information to learn more about the Get-PSDrive cmdlet.
Results: After completing this exercise, you will have investigated PowerShell help and the SQL
PowerShell provider.
6. Select the code under the #2# comment, and then on the toolbar, click Run Selection to change the
location.
7. Select the code under the #3# comment, and then on the toolbar, click Run Selection to get the
database object.
8. Select the code under the #4# comment, and then on the toolbar, click Run Selection to display the
database properties.
9. Select the code under the #5# comment, and then on the toolbar, click Run Selection to display the
database option.
3. Select the code under the #1# comment, and then on the toolbar, click Run Selection to prepare the
script.
4. Select the code under the #2# comment, and then on the toolbar, click Run Selection to change the
location.
5. Select the code under the #3# comment, and then on the toolbar, click Run Selection to get the
database object.
6. Select the code under the #4# comment, and then on the toolbar, click Run Selection to change the
compatibility level of the AdventureWorks2016 database.
7. Select the code under the #5# comment, and then on the toolbar, click Run Selection to change the
ANSI nulls, autoshrink, read only, and recovery model options.
3. Select the code under the #1# comment, and then on the toolbar, click Run Selection to import the
SQL module. Ignore any warnings that may appear.
4. Select the code under the #2# comment, and then on the toolbar, click Run Selection to change the
location.
5. Select the code under the #3# comment, and then on the toolbar, click Run Selection to get the
server object.
6. Select the code under the #4# comment, and then on the toolbar, click Run Selection to display the
Settings object.
7. Select the code under the #5# comment, and then on the toolbar, click Run Selection to change the
login mode to mixed authentication.
8. Select the code under the #6# comment, and then on the toolbar, click Run Selection to change the
login mode back to integrated authentication.
9. Select the code under the #7# comment, and then on the toolbar, click Run Selection to examine
the UserOptions object.
10. Select the code under the #8# comment, and then on the toolbar, click Run Selection to change
some server settings.
11. Select the code under the #9# comment, and then on the toolbar, click Run Selection to reset the
server settings.
Results: After completing this lab exercise, you will have PowerShell scripts to show the IT Director.
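The SMO property changes made through PowerShell in this lab have direct Transact-SQL equivalents, which can be useful for comparison. The option values below are illustrative, not necessarily those used in the lab script:

```sql
-- Transact-SQL equivalents of the SMO database option changes.
ALTER DATABASE AdventureWorks2016 SET COMPATIBILITY_LEVEL = 130;
ALTER DATABASE AdventureWorks2016 SET ANSI_NULLS ON;
ALTER DATABASE AdventureWorks2016 SET AUTO_SHRINK OFF;
ALTER DATABASE AdventureWorks2016 SET READ_WRITE;
ALTER DATABASE AdventureWorks2016 SET RECOVERY FULL;
```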
3. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
3. If a message is displayed asking you to confirm a change in execution policy, type Y, and then press
Enter.
4. Wait for the workload to complete, which should take about a minute, and then press Enter to close
the Windows PowerShell window.
3. In the Open Project dialog box, navigate to the D:\Labfiles\Lab12\Starter\Project folder, click
Project.ssmssln, and then click Open.
5. Edit the code under the comment that begins Task 2 so that it reads as follows:
6. Select the query that you edited in the previous step, and then click Execute.
2. Select the query that you edited in the previous step, and then click Execute.
3. In the Results pane, click any of the row values in the deadlock_data column to view the deadlock
XML in detail.
Results: After completing this exercise, you will have extracted deadlock data from the SQL Server.
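The project script queried in this exercise is not reproduced in this answer key. One common way to extract deadlock data, which the lab script may resemble, is to read the ring buffer target of the built-in system_health Extended Events session (a sketch):

```sql
-- Read the system_health session's ring buffer target; the XML it returns
-- contains xml_deadlock_report events among other diagnostic data.
SELECT CAST(xt.target_data AS XML) AS target_data
FROM sys.dm_xe_sessions AS xs
JOIN sys.dm_xe_session_targets AS xt
    ON xs.address = xt.event_session_address
WHERE xs.name = N'system_health'
  AND xt.target_name = N'ring_buffer';
```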
4. On the Set Session Properties page, in the Session name box, type track page splits, and then
click Next.
6. On the Select Events To Capture page, in the Event library section, in the Channel column header,
click the drop-down button, and then select the Debug check box.
7. In the Search Events box, type transaction_log, double-click the transaction_log row in the Event
Library list, which will add it to the Selected events list, and then click Next.
9. On the Set Session Event Filters page, click Click here to add a clause.
10. In the Field drop-down list, click sqlserver.database_name.
11. In the Value box, type AdventureWorks, and then click Finish.
13. In Object Explorer, expand Sessions, right-click track page splits, and then click Properties.
14. In the Session Properties dialog box, on the Events page, click Configure.
16. On the Filter (Predicate) tab, click Click here to add a clause.
21. On the Data Storage page, click Click here to add a target.
24. In the Base buckets on section, click Field, in the Field list, click alloc_unit_id, and then click OK.
25. In Object Explorer, right-click track page splits, and then click Start Session.
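The wizard steps above correspond roughly to the following Transact-SQL session definition. This is a sketch: the wizard adds further defaults, and the second filter clause configured in step 16 is not shown here:

```sql
-- Extended Events session that counts transaction_log events per
-- allocation unit, filtered to the AdventureWorks database.
CREATE EVENT SESSION [track page splits] ON SERVER
ADD EVENT sqlserver.transaction_log (
    WHERE sqlserver.database_name = N'AdventureWorks')
ADD TARGET package0.histogram (
    SET source = N'alloc_unit_id', source_type = 0);  -- bucket on event field

-- Start the session.
ALTER EVENT SESSION [track page splits] ON SERVER STATE = START;
```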
2. If a message is displayed asking you to confirm a change in execution policy, type Y, and then press
Enter.
3. Wait for the workload to complete. This should take about 60 seconds.
USE AdventureWorks;
SELECT CAST(target_data AS XML) AS target_data
FROM sys.dm_xe_sessions AS xs
JOIN sys.dm_xe_session_targets xt
ON xs.address = xt.event_session_address
WHERE xs.name = 'track page splits'
AND xt.target_name = 'histogram';
2. Select the query that you edited in the previous step, and then click Execute.
3. In the results pane, click the returned XML to review the data.
2. Select the query that you edited in the previous step, and then click Execute.
2. Select the query that you edited in the previous step, and then click Execute.
3. Review the objects affected by page splits.
Results: After completing this exercise, you will have extracted page split data from SQL Server.
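To see which objects the histogram buckets refer to, the alloc_unit_id values can be joined to the catalog views. This sketch shreds the histogram XML returned by the query above; it assumes the usual HistogramTarget/Slot/value shape and that container_id matches partition_id for the allocation units involved:

```sql
-- Map histogram buckets (keyed on alloc_unit_id) to object names.
SELECT o.name AS object_name,
       slot.node.value('@count', 'bigint') AS event_count
FROM (
    SELECT CAST(xt.target_data AS XML) AS target_data
    FROM sys.dm_xe_sessions AS xs
    JOIN sys.dm_xe_session_targets AS xt
        ON xs.address = xt.event_session_address
    WHERE xs.name = N'track page splits'
      AND xt.target_name = N'histogram'
) AS t
CROSS APPLY t.target_data.nodes('//Slot') AS slot(node)
JOIN sys.allocation_units AS au
    ON au.allocation_unit_id = slot.node.value('(value/text())[1]', 'bigint')
JOIN sys.partitions AS p
    ON p.partition_id = au.container_id
JOIN sys.objects AS o
    ON o.object_id = p.object_id;
```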
3. In the User Account Control dialog box, click Yes, and leave the window open; it creates a load on
the database.
3. In the MIA-SQL – Activity Monitor tab, click the Recent Expensive Queries row to expand it.
4. Note that the first row in the list starts with SELECT total.name.
5. In the Object Explorer pane, expand the Management folder, right-click Data Collection, point to
Tasks, and then click Configure Management Data Warehouse.
8. In the New Database dialog, in the Database name textbox, type MDW.
9. Click OK.
10. On the Configure Management Data Warehouse Storage page, click Next.
13. On the Configure Data Collection Wizard Progress page, click Close.
14. Note in the Object Explorer, a new MDW database has been created.
15. Leave SQL Server Management Studio open for the next task.
3. On the Setup Data Collection Sets page, next to the Server name textbox, click the … button.
4. In the Connect to Server dialog, verify that the Server name is MIA-SQL, and then click Connect.
5. On the Setup Data Collection Sets page, in the Database name dropdown, select MDW.
6. In the Select data collector sets you want to enable list, select System Data Collection Sets, and
then click Next.
7. On the Complete the Wizard page, click Finish.
9. In the Object Explorer pane, expand Management, expand Data Collection, expand System Data
Collection Sets, right-click Query Statistics, then click Properties.
10. In the Data Collection Set Properties dialog, under Data Collection and upload, select Non-
cached – Collect and upload data on the same schedule.
12. In the Pick Schedule for job dialog, click the row with ID 2, Occurs every day every 5 minutes.
15. In the Object Explorer pane, expand the Databases folder, right-click the MDW database, point to
Reports, point to Management Data Warehouse, and then click Management Data Warehouse
Overview.
16. On the Management Data Warehouse Overview: MDW report, under the Query Statistics
column, click the date hyperlink.
17. On the Query Statistics History report, in the Query # table, click the SELECT total.Name hyperlink.
18. On the Query Details report, scroll to the bottom, in the Top Query Plans by Average CPU Per
Execution table, in the Plan # column, click the 1 hyperlink.
19. On the Query Plan Details report, at the bottom of the page, click the View graphical query
execution plan hyperlink.
23. Leave SQL Server Management Studio open for the next exercise.
Results: After completing this exercise, you should have configured a Management Data Warehouse
called MDW on the MIA-SQL instance.
2. In the User Account Control dialog box, click Yes, and leave the window open; it creates a load on
the database.
3. Right-click the top query starting with SELECT total.name, and then click Show Execution Plan.
4. Note that SQL Server has identified a missing index that, if created, could improve performance.
8. In the Object Explorer pane, expand the Management folder, expand Data Collection, expand the
System Data Collection Sets folder, right-click Query Statistics, and then click Collect and Upload
Now.
9. In the Collect and upload Data Collection Sets dialog, click Close.
10. In the Query Statistics History report, at the top of the report, click Refresh.
12. In the table below, click the top hyperlink starting with SELECT total.name.
13. On the Query Details report, scroll to the bottom of the report, in the Top Query Plans By Average
CPU Per Execution table, in the Plan # column, click the hyperlink 1.
14. Note that there is a Missing Indexes textbox with a suggestion to create an index to improve
performance.
Use Activity Monitor to see which running queries are the most expensive
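The same information that Activity Monitor shows under Recent Expensive Queries can also be retrieved directly from the dynamic management views, for example with a query like this sketch, which orders by average CPU time:

```sql
-- Top queries by average CPU time, similar in spirit to Activity
-- Monitor's Recent Expensive Queries list.
SELECT TOP (10)
       qs.total_worker_time / qs.execution_count AS avg_cpu_microseconds,
       qs.execution_count,
       st.text AS query_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_cpu_microseconds DESC;
```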
3. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
3. In Object Explorer, expand the MIA-SQL server, expand Security, and then expand Logins.
4. Note that the PromoteApp login has an icon with a red down arrow, indicating that the login is disabled.
5. Right-click PromoteApp, and click Properties.
6. In the Login Properties - PromoteApp dialog box, on the Status page, click Enabled, and then click
OK.
7. In Object Explorer, right-click Logins, and click Refresh. Note that the login is now enabled.
8. Leave SQL Server Management Studio open for the next exercise.
Results: After this exercise, you will have investigated and resolved a SQL login issue; the PromoteApp
login will be functioning properly.
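The fix applied through the Login Properties dialog is equivalent to a single statement, shown here for reference:

```sql
-- Enable the disabled login.
ALTER LOGIN PromoteApp ENABLE;
```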
2. Read the error message, and notice that it includes more information than users reported. In
particular, it says:
The report server can’t connect to its database. Make sure the database is running
and accessible.
3. This might indicate a problem with the MIA-SQL\SQL2 database engine instance.
5. In Event Viewer, in the left pane, expand Windows Logs, and then click System.
6. Click on each of the three most recent messages with a Level of Error and a Source of Service
Control Manager. Notice that the oldest of the three messages contains the following message:
3. In SQL Server Configuration Manager, in the left pane, click SQL Server Services.
4. In the right pane, note that the SQL Server (SQL2) service is not running.
6. In the SQL Server (SQL2) Properties dialog box, on the Log On tab, in the Password and Confirm
password boxes, type Pa55w.rd, and then click OK.
7. Right-click SQL Server (SQL2), and click Start. Notice that the service starts normally.
8. Close SQL Server Configuration Manager.
9. In Internet Explorer, click the Refresh button. Notice that Reporting Services is now accessible.
2. When the script completes, press any key to close the command prompt window.
2. On the General page, notice that the login is configured for Windows authentication.
3. On the Status page, notice that the login has permission to connect to the database engine, and that
it is enabled. Click Cancel.
4. In Object Explorer, expand Management, expand SQL Server Logs, right-click the log with the name
that begins Current, and then click View SQL Server Log.
5. In the Log File Viewer – MIA-SQL window, in the right pane, locate the first entry with a Source of
Logon. Notice that the error message begins:
6. Click Close.
7. Leave SQL Server Management Studio open for the next exercise.
8. The user’s attempt to log in is failing because he is providing a user name and password when he
should be using trusted authentication. He should run sqlcmd with the -E switch to use trusted
authentication, not with the -U and -P switches, which supply a user name and password for SQL
Server authentication.
Results: At the end of this exercise, you will be able to explain to the user why he cannot connect to the
database.
2. In the Start Jobs - MIA-SQL dialog box, note the failure, and then click Close.
2. In the Log File Viewer - MIA-SQL window, expand the first entry.
3. Click the row with a Step ID value of 1; in the bottom pane, scroll down to view the error
message. Notice that the job is attempting to call a stored procedure that does not exist. Click Close.
5. In the Job Properties - Generate File List window, on the Steps page, click Edit.
6. In the Job Step Properties - Execute Procedure window, in the Database and Command boxes,
observe that the job is attempting to execute a stored procedure in the InternetSales database.
7. In Object Explorer, under MIA-SQL, expand Databases, expand InternetSales, expand
Programmability, and then expand Stored Procedures. Observe that a stored procedure exists in
the database with the name dbo.usp_GenerateFileList.
8. In the Job Step Properties - Execute Procedure window, edit the content of the Command box so
that it reads the following, and then click OK:
EXECUTE dbo.usp_GenerateFileList
GO
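As an alternative to browsing the job properties dialog boxes, the command text of each job step can be inspected directly in the msdb system tables. This is a sketch; the job name in the WHERE clause is assumed to match the lab job:

```sql
-- Inspect the command text of each step of the job (job name assumed)
SELECT j.name AS job_name, s.step_id, s.step_name, s.database_name, s.command
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobsteps AS s ON s.job_id = j.job_id
WHERE j.name = N'Generate File List';
```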
11. Leave SQL Server Management Studio open for the next exercise.
Results: After this exercise, you will have investigated and resolved a job execution issue.
2. Notice that three command prompt windows are opened by the script; these windows represent
users of the InternetSales database. Do not interact with or close these windows; two of them will
close when the issue is resolved.
3. In the filter list, click InternetSales. This filters the list to show only connections to the InternetSales
database.
4. The filtered list includes three entries where the value of the Application Name column is
SQLCMD. Notice that for one of these rows (the row with the lowest Session ID value) the value of the Head
Blocker column is 1, and that the other two rows have values in the Blocked By column.
5. Click the row that has a Head Blocker value of 1, right-click the row, and then click Details.
6. In the Session details… dialog box, review the text of the Transact-SQL command being executed by the
session. Notice that the command includes a BEGIN TRANSACTION statement
without a corresponding COMMIT or ROLLBACK; this open transaction is blocking other sessions from
accessing the Sales.Orders table.
7. To enable other users to continue to work with the application, click Kill Process to kill the blocking
session.
9. After a few seconds (the polling interval of Activity Monitor), notice that the SQLCMD connections
disappear from the list; the sessions are no longer blocked by the hanging transaction.
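The same diagnosis can be carried out in Transact-SQL instead of Activity Monitor. A minimal sketch, assuming sufficient permissions; the session ID passed to KILL is a placeholder that must be replaced with the head blocker's actual session ID:

```sql
-- List blocked sessions and the statement each one is waiting on
SELECT r.session_id, r.blocking_session_id, r.wait_type, t.text AS sql_text
FROM sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE r.blocking_session_id <> 0;

-- Terminate the head blocker; 53 is a placeholder session ID
KILL 53;
```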
Results: At the end of this exercise, you will have resolved a performance issue.
4. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
7. In the Microsoft Access database engine 2016 (English) Setup dialog box, click Next.
8. On the End-User License Agreement page, select I accept the terms in the License Agreement,
and then click Next.
9. Click Install.
2. Review the contents of the file. Close the file when you have finished your review.
2. In the SQL Server Import and Export Wizard window, click Next.
3. On the Choose a Data Source page, in the Data Source box, click Microsoft Excel.
5. In the Excel Version list, click Microsoft Excel 2016, and then click Next.
6. On the Choose a Destination page, in the Destination list, click SQL Server Native Client 11.0.
7. Verify that the Server Name box contains the value (local), and that Use Windows Authentication
is selected.
9. On the Specify Table Copy or Query page, verify that Copy data from one or more tables or
views is selected, and then click Next.
10. On the Select Source Tables and Views page, in the first Destination: (local) list, click the
dropdown and select [Accounts].[CurrencyCode], and then click Edit Mappings.
11. In the Column Mappings dialog box, verify that Append rows to destination table is selected.
12. In the Mappings box, in the first Destination box to the right of alphabetic_code, click
currency_code.
13. In the second Destination box to the right of numeric_code, click currency_numeric_code, and
then click OK.
14. On the Select Source Tables and Views page, click Next.
16. On the Save and Run Package page, verify that Run immediately is selected, then click Finish.
24. In Solution Explorer, expand Queries, and then double-click Lab Exercise 01 - currency codes.sql.
27. Leave SQL Server Management Studio open for the next exercise.
2. At the command prompt, type the following code, and then press Enter:
bcp AdventureWorks.Accounts.ExchangeRate in
D:\Labfiles\Lab15\Starter\Import\currency_exchange_rates.txt -S MIA-SQL -T -c -t ,
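For comparison, a similar load can be run from Transact-SQL with BULK INSERT. This sketch assumes the same comma-delimited character file used by the bcp command above:

```sql
-- Load the comma-delimited file into the same table (file path from the lab)
BULK INSERT AdventureWorks.Accounts.ExchangeRate
FROM 'D:\Labfiles\Lab15\Starter\Import\currency_exchange_rates.txt'
WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\n');
```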
2. Highlight the query under the heading for Task 1 and click Execute. Note that the foreign key
constraints are not trusted.
3. Highlight the query under the heading for Task 2 and click Execute.
4. Highlight the query under the heading for Task 1 and click Execute. Note that the foreign key
constraints are now trusted.
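Behind the scenes, a foreign key becomes untrusted when rows are loaded with constraint checking disabled (as bulk loads can do), and it becomes trusted again only after SQL Server revalidates the existing rows. A sketch of the relevant statements; the table name is assumed to match the lab:

```sql
-- List foreign keys that the optimizer cannot trust
SELECT name, is_not_trusted
FROM sys.foreign_keys
WHERE is_not_trusted = 1;

-- Revalidate existing rows so the table's constraints become trusted again
ALTER TABLE Accounts.ExchangeRate WITH CHECK CHECK CONSTRAINT ALL;
```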
2. Highlight the query under the heading for Task 1 and click Execute.
5. Right-click Package1.dtsx, click Rename, type prospects, and then press Enter.
8. Right-click Data Flow Task, click Rename, type Export Prospects, and then press Enter.
2. In the Source Assistant - Add New Source dialog box, in the Select source type box, click SQL
Server.
3. In the Select connection managers box, click New, and then click OK.
4. In the Connection Manager dialog box, in the Server name box, type MIA-SQL, and verify that Use
Windows Authentication is selected.
5. In the Select or enter a database name list, click AdventureWorks, and then click OK.
6. Right-click OLE DB Source, click Rename, type usp_prospect_list, and then press Enter.
9. In the SQL command text box, type EXEC Sales.usp_prospect_list;, and then click OK.
2. Right-click Flat File Destination, click Rename, type prospects file, and then press Enter.
3. Click usp_prospect_list, click on the left (blue) arrow underneath the usp_prospect_list object, and
then click prospects file.
6. In the Flat File Format dialog box, click Delimited, and then click OK.
7. In the Flat File Connection Manager Editor dialog box, on the General page, in the Connection
manager name box, type prospect file connection.
9. Select the Column names in the first data row check box.
10. On the Columns page, in the Row Delimiter box, verify that {CR}{LF} is selected.
11. In the Column delimiter box, verify that Comma {,} is selected, and then click OK.
12. In the Flat File Destination Editor dialog box, on the Mappings page, note the mappings between
the Input Column and the Destination Column, and then click OK.
2. In the Unpack Microsoft SQL Server DAC Package File dialog box, in the Files will be unpacked
to this folder box, verify the location is
D:\Labfiles\Lab15\Starter\FixedAssets\FixedAssets_1.0.9.1, and then click Unpack.
3. When unpacking is complete, in the FixedAssets_1.0.9.1 window, double-click the model.sql script
to review how the DACPAC is going to behave.
2. In the Deploy Data-tier Application dialog box, on the Introduction page, click Next.
3. On the Select Package page, in the DAC package (file name with the .dacpac extension) box,
type D:\Labfiles\Lab15\Starter\FixedAssets\FixedAssets_1.0.9.1.dacpac, and then click Next.
7. In Object Explorer, under MIA-SQL, right-click Databases, and then click Refresh.
8. Expand Databases, and verify that the FixedAssets_1.0.9.1 database has been created successfully.
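The same deployment can be scripted with the SqlPackage utility instead of the SSMS wizard. A sketch only; the SqlPackage installation path, and details such as DAC registration, may differ from the wizard's behavior:

```
rem Deploy the DACPAC from the command line (file path and names from the lab)
SqlPackage /Action:Publish ^
  /SourceFile:"D:\Labfiles\Lab15\Starter\FixedAssets\FixedAssets_1.0.9.1.dacpac" ^
  /TargetServerName:MIA-SQL /TargetDatabaseName:FixedAssets_1.0.9.1
```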