
MCT USE ONLY. STUDENT USE PROHIBITED

OFFICIAL MICROSOFT LEARNING PRODUCT

10988C
Managing SQL Business Intelligence
Operations

Information in this document, including URL and other Internet Web site references, is subject to change
without notice. Unless otherwise noted, the example companies, organizations, products, domain names,
e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with
any real company, organization, product, domain name, e-mail address, logo, person, place or event is
intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the
user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in
or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.

Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property
rights covering subject matter in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not give you any license to these
patents, trademarks, copyrights, or other intellectual property.

The names of manufacturers, products, or URLs are provided for informational purposes only and
Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding
these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a
manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links
may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not
responsible for the contents of any linked site or any link contained in a linked site, or any changes or
updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission
received from any linked site. Microsoft is providing these links to you only as a convenience, and the
inclusion of any link does not imply endorsement of Microsoft of the site or the products contained
therein.
© 2018 Microsoft Corporation. All rights reserved.

Microsoft and the trademarks listed at
https://www.microsoft.com/en-us/legal/intellectualproperty/trademarks/en-us.aspx are trademarks of the
Microsoft group of companies. All other trademarks are property of their respective owners.

Product Number: 10988C

Part Number (if applicable): X21-64451

Released: 02/2018
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE

These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which
includes the media on which you received it, if any. These license terms also apply to Trainer Content and any
updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms
apply.

BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS.
IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.

If you comply with these license terms, you have the rights below for each license you acquire.

1. DEFINITIONS.

a. “Authorized Learning Center” means a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, or such other entity as Microsoft may designate from time to time.

b. “Authorized Training Session” means the instructor-led training class using Microsoft Instructor-Led
Courseware conducted by a Trainer at or through an Authorized Learning Center.

c. “Classroom Device” means one (1) dedicated, secure computer that an Authorized Learning Center owns
or controls that is located at an Authorized Learning Center’s training facilities that meets or exceeds the
hardware level specified for the particular Microsoft Instructor-Led Courseware.

d. “End User” means an individual who is (i) duly enrolled in and attending an Authorized Training Session
or Private Training Session, (ii) an employee of a MPN Member, or (iii) a Microsoft full-time employee.

e. “Licensed Content” means the content accompanying this agreement which may include the Microsoft
Instructor-Led Courseware or Trainer Content.

f. “Microsoft Certified Trainer” or “MCT” means an individual who is (i) engaged to teach a training session
to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a
Microsoft Certified Trainer under the Microsoft Certification Program.

g. “Microsoft Instructor-Led Courseware” means the Microsoft-branded instructor-led training course that
educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led
Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.

h. “Microsoft IT Academy Program Member” means an active member of the Microsoft IT Academy
Program.

i. “Microsoft Learning Competency Member” means an active member of the Microsoft Partner Network
program in good standing that currently holds the Learning Competency status.

j. “MOC” means the “Official Microsoft Learning Product” instructor-led courseware known as Microsoft
Official Course that educates IT professionals and developers on Microsoft technologies.

k. “MPN Member” means an active Microsoft Partner Network program member in good standing.
l. “Personal Device” means one (1) personal computer, device, workstation or other digital electronic device
that you personally own or control that meets or exceeds the hardware level specified for the particular
Microsoft Instructor-Led Courseware.

m. “Private Training Session” means the instructor-led training classes provided by MPN Members for
corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware.
These classes are not advertised or promoted to the general public and class attendance is restricted to
individuals employed by or contracted by the corporate customer.

n. “Trainer” means (i) an academically accredited educator engaged by a Microsoft IT Academy Program
Member to teach an Authorized Training Session, and/or (ii) a MCT.

o. “Trainer Content” means the trainer version of the Microsoft Instructor-Led Courseware and additional
supplemental content designated solely for Trainers’ use to teach a training session using the Microsoft
Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer
preparation guide, train the trainer materials, Microsoft One Note packs, classroom setup guide and Pre-
release course feedback form. To clarify, Trainer Content does not include any software, virtual hard
disks or virtual machines.

2. USE RIGHTS. The Licensed Content is licensed not sold. The Licensed Content is licensed on a one copy
per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed
Content.

2.1 Below are five separate sets of use rights. Only one set of rights applies to you.

a. If you are a Microsoft IT Academy Program Member:


i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User who is enrolled in the Authorized Training Session, and only immediately prior to the
commencement of the Authorized Training Session that is the subject matter of the Microsoft
Instructor-Led Courseware being provided, or
2. provide one (1) End User with the unique redemption code and instructions on how they can
access one (1) digital version of the Microsoft Instructor-Led Courseware, or
3. provide one (1) Trainer with the unique redemption code and instructions on how they can
access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure each End User attending an Authorized Training Session has their own valid licensed
copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized Training
Session,
v. you will ensure that each End User provided with the hard-copy version of the Microsoft Instructor-
Led Courseware will be presented with a copy of this agreement and each End User will agree that
their use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement
prior to providing them with the Microsoft Instructor-Led Courseware. Each individual will be required
to denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid
licensed copy of the Trainer Content that is the subject of the Authorized Training Session,
vii. you will only use qualified Trainers who have in-depth knowledge of and experience with the
Microsoft technology that is the subject of the Microsoft Instructor-Led Courseware being taught for
all your Authorized Training Sessions,
viii. you will only deliver a maximum of 15 hours of training per week for each Authorized Training
Session that uses a MOC title, and
ix. you acknowledge that Trainers that are not MCTs will not have access to all of the trainer resources
for the Microsoft Instructor-Led Courseware.

b. If you are a Microsoft Learning Competency Member:


i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User attending the Authorized Training Session and only immediately prior to the
commencement of the Authorized Training Session that is the subject matter of the Microsoft
Instructor-Led Courseware provided, or
2. provide one (1) End User attending the Authorized Training Session with the unique redemption
code and instructions on how they can access one (1) digital version of the Microsoft Instructor-
Led Courseware, or
3. you will provide one (1) Trainer with the unique redemption code and instructions on how they
can access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure that each End User attending an Authorized Training Session has their own valid
licensed copy of the Microsoft Instructor-Led Courseware that is the subject of the Authorized
Training Session,
v. you will ensure that each End User provided with a hard-copy version of the Microsoft Instructor-Led
Courseware will be presented with a copy of this agreement and each End User will agree that their
use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to
providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to
denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching an Authorized Training Session has their own valid
licensed copy of the Trainer Content that is the subject of the Authorized Training Session,
vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is
the subject of the Microsoft Instructor-Led Courseware being taught for your Authorized Training
Sessions,
viii. you will only use qualified MCTs who also hold the applicable Microsoft Certification credential that is
the subject of the MOC title being taught for all your Authorized Training Sessions using MOC,
ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and
x. you will only provide access to the Trainer Content to Trainers.
c. If you are a MPN Member:
i. Each license acquired on behalf of yourself may only be used to review one (1) copy of the Microsoft
Instructor-Led Courseware in the form provided to you. If the Microsoft Instructor-Led Courseware is
in digital format, you may install one (1) copy on up to three (3) Personal Devices. You may not
install the Microsoft Instructor-Led Courseware on a device you do not own or control.
ii. For each license you acquire on behalf of an End User or Trainer, you may either:
1. distribute one (1) hard copy version of the Microsoft Instructor-Led Courseware to one (1) End
User attending the Private Training Session, and only immediately prior to the commencement
of the Private Training Session that is the subject matter of the Microsoft Instructor-Led
Courseware being provided, or
2. provide one (1) End User who is attending the Private Training Session with the unique
redemption code and instructions on how they can access one (1) digital version of the
Microsoft Instructor-Led Courseware, or
3. you will provide one (1) Trainer who is teaching the Private Training Session with the unique
redemption code and instructions on how they can access one (1) Trainer Content,
provided you comply with the following:
iii. you will only provide access to the Licensed Content to those individuals who have acquired a valid
license to the Licensed Content,
iv. you will ensure that each End User attending a Private Training Session has their own valid licensed
copy of the Microsoft Instructor-Led Courseware that is the subject of the Private Training Session,
v. you will ensure that each End User provided with a hard copy version of the Microsoft Instructor-Led
Courseware will be presented with a copy of this agreement and each End User will agree that their
use of the Microsoft Instructor-Led Courseware will be subject to the terms in this agreement prior to
providing them with the Microsoft Instructor-Led Courseware. Each individual will be required to
denote their acceptance of this agreement in a manner that is enforceable under local law prior to
their accessing the Microsoft Instructor-Led Courseware,
vi. you will ensure that each Trainer teaching a Private Training Session has their own valid licensed
copy of the Trainer Content that is the subject of the Private Training Session,
vii. you will only use qualified Trainers who hold the applicable Microsoft Certification credential that is
the subject of the Microsoft Instructor-Led Courseware being taught for all your Private Training
Sessions,
viii. you will only use qualified MCTs who hold the applicable Microsoft Certification credential that is the
subject of the MOC title being taught for all your Private Training Sessions using MOC,
ix. you will only provide access to the Microsoft Instructor-Led Courseware to End Users, and
x. you will only provide access to the Trainer Content to Trainers.

d. If you are an End User:


For each license you acquire, you may use the Microsoft Instructor-Led Courseware solely for your
personal training use. If the Microsoft Instructor-Led Courseware is in digital format, you may access the
Microsoft Instructor-Led Courseware online using the unique redemption code provided to you by the
training provider and install and use one (1) copy of the Microsoft Instructor-Led Courseware on up to
three (3) Personal Devices. You may also print one (1) copy of the Microsoft Instructor-Led Courseware.
You may not install the Microsoft Instructor-Led Courseware on a device you do not own or control.

e. If you are a Trainer.


i. For each license you acquire, you may install and use one (1) copy of the Trainer Content in the
form provided to you on one (1) Personal Device solely to prepare and deliver an Authorized
Training Session or Private Training Session, and install one (1) additional copy on another Personal
Device as a backup copy, which may be used only to reinstall the Trainer Content. You may not
install or use a copy of the Trainer Content on a device you do not own or control. You may also
print one (1) copy of the Trainer Content solely to prepare for and deliver an Authorized Training
Session or Private Training Session.
ii. You may customize the written portions of the Trainer Content that are logically associated with
instruction of a training session in accordance with the most recent version of the MCT agreement.
If you elect to exercise the foregoing rights, you agree to comply with the following: (i)
customizations may only be used for teaching Authorized Training Sessions and Private Training
Sessions, and (ii) all customizations will comply with this agreement. For clarity, any use of
“customize” refers only to changing the order of slides and content, and/or not using all the slides or
content, it does not mean changing or modifying any slide or content.

2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not
separate its components and install them on different devices.

2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may
not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any
third parties without the express written permission of Microsoft.

2.4 Third Party Notices. The Licensed Content may include third party code that Microsoft, not the
third party, licenses to you under this agreement. Notices, if any, for the third party code are included
for your information only.

2.5 Additional Terms. Some Licensed Content may contain components with additional terms,
conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also
apply to your use of that respective component and supplements the terms described in this agreement.

3. LICENSED CONTENT BASED ON PRE-RELEASE TECHNOLOGY. If the Licensed Content’s subject


matter is based on a pre-release version of Microsoft technology (“Pre-release”), then in addition to the
other provisions in this agreement, these terms also apply:

a. Pre-Release Licensed Content. This Licensed Content subject matter is on the Pre-release version of
the Microsoft technology. The technology may not work the way a final version of the technology will
and we may change the technology for the final version. We also may not release a final version.
Licensed Content based on the final version of the technology may not contain the same information as
the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you
with any further content, including any Licensed Content based on the final version of the technology.

b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or
through its third party designee, you give to Microsoft without charge, the right to use, share and
commercialize your feedback in any way and for any purpose. You also give to third parties, without
charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback.
You will not give feedback that is subject to a license that requires Microsoft to license its technology,
technologies, or products to third parties because we include your feedback in them. These rights
survive this agreement.

c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on
the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the
Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the
technology that is the subject of the Licensed Content, whichever is earliest (“Pre-release term”).
Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies
of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some
rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more
rights despite this limitation, you may use the Licensed Content only as expressly permitted in this
agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only
allows you to use it in certain ways. Except as expressly permitted in this agreement, you may not:
• access or allow any individual to access the Licensed Content if they have not acquired a valid license
for the Licensed Content,
• alter, remove or obscure any copyright or other protective notices (including watermarks), branding
or identifications contained in the Licensed Content,
• modify or create a derivative work of any Licensed Content,
• publicly display, or make the Licensed Content available for others to access or use,
• copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or
distribute the Licensed Content to any third party,
• work around any technical limitations in the Licensed Content, or
• reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the
Licensed Content except and only to the extent that applicable law expressly permits, despite this
limitation.

5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to
you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws
and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the
Licensed Content.

6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations.
You must comply with all domestic and international export laws and regulations that apply to the Licensed
Content. These laws include restrictions on destinations, end users and end use. For additional information,
see www.microsoft.com/exporting.

7. SUPPORT SERVICES. Because the Licensed Content is “as is”, we may not provide support services for it.

8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail
to comply with the terms and conditions of this agreement. Upon termination of this agreement for any
reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in
your possession or under your control.

9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed
Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for
the contents of any third party sites, any links contained in third party sites, or any changes or updates to
third party sites. Microsoft is not responsible for webcasting or any other form of transmission received
from any third party sites. Microsoft is providing these links to third party sites to you only as a
convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party
site.

10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and
supplements are the entire agreement for the Licensed Content, updates and supplements.

11. APPLICABLE LAW.


a. United States. If you acquired the Licensed Content in the United States, Washington state law governs
the interpretation of this agreement and applies to claims for breach of it, regardless of conflict of laws
principles. The laws of the state where you live govern all other claims, including claims under state
consumer protection laws, unfair competition laws, and in tort.
b. Outside the United States. If you acquired the Licensed Content in any other country, the laws of that
country apply.

12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws
of your country. You may also have rights with respect to the party from whom you acquired the Licensed
Content. This agreement does not change your rights under the laws of your country if the laws of your
country do not permit it to do so.

13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS
AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE
AFFILIATES GIVES NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY
HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT
CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND
ITS RESPECTIVE AFFILIATES EXCLUDES ANY IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.

14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP
TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL,
LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.

This limitation applies to


o anything related to the Licensed Content, services, content (including code) on third party Internet
sites or third-party programs; and
o claims for breach of contract, breach of warranty, guarantee or condition, strict liability, negligence,
or other tort to the extent permitted by applicable law.

It also applies even if Microsoft knew or should have known about the possibility of the damages. The
above limitation or exclusion may not apply to you because your country may not allow the exclusion or
limitation of incidental, consequential or other damages.

Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this
agreement are provided below in French.

Remarque : Ce le contenu sous licence étant distribué au Québec, Canada, certaines des clauses
dans ce contrat sont fournies ci-dessous en français.

EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute
utilisation de ce contenu sous licence est à votre seule risque et péril. Microsoft n’accorde aucune autre garantie
expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection dues
consommateurs, que ce contrat ne peut modifier. La ou elles sont permises par le droit locale, les garanties
implicites de qualité marchande, d’adéquation à un usage particulier et d’absence de contrefaçon sont exclues.

LIMITATION DES DOMMAGES-INTÉRÊTS ET EXCLUSION DE RESPONSABILITÉ POUR LES


DOMMAGES. Vous pouvez obtenir de Microsoft et de ses fournisseurs une indemnisation en cas de dommages
directs uniquement à hauteur de 5,00 $ US. Vous ne pouvez prétendre à aucune indemnisation pour les autres
dommages, y compris les dommages spéciaux, indirects ou accessoires et pertes de bénéfices.
Cette limitation concerne:
• tout ce qui est relié au le contenu sous licence, aux services ou au contenu (y compris le code)
figurant sur des sites Internet tiers ou dans des programmes tiers; et.
• les réclamations au titre de violation de contrat ou de garantie, ou au titre de responsabilité
stricte, de négligence ou d’une autre faute dans la limite autorisée par la loi en vigueur.
Elle s’applique également, même si Microsoft connaissait ou devrait connaître l’éventualité d’un tel dommage. Si
votre pays n’autorise pas l’exclusion ou la limitation de responsabilité pour les dommages indirects, accessoires
ou de quelque nature que ce soit, il se peut que la limitation ou l’exclusion ci-dessus ne s’appliquera pas à votre
égard.

EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d’autres droits
prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre
pays si celles-ci ne le permettent pas.

Revised July 2013



Acknowledgements
Microsoft Learning would like to acknowledge and thank the following for their contribution towards
developing this title. Their effort at various stages in the development has ensured that you have a good
classroom experience.

Aaron Johal – Content Developer


Aaron Johal is a Microsoft Certified Trainer who splits his time between training, consultancy, content
development, contracting and learning since he moved into the non-functional side of the Information
Technology business. He has presented technical sessions at SQL PASS in Denver and at SQLBits in London.
He has also taught and worked in a consulting capacity throughout the UK and abroad, including Africa,
Spain, Saudi Arabia, the Netherlands, France, and Ireland. He enjoys interfacing functional and non-functional
roles to try and close the gaps between effective use of Information Technology and the needs of the
business.

Caroline Eveleigh – Content Developer


Caroline Eveleigh is a Microsoft Certified Professional and SQL Server specialist. She has worked with SQL
Server since version 6.5 and, before that, with Microsoft Access and dBase. Caroline works on database
development and Microsoft Azure projects for both corporates and small businesses. She is an
experienced business analyst, helping customers to re-engineer business processes and improve
decision-making using data analysis. Caroline is a trained technical author and a frequent blogger on project
management, business intelligence, and business efficiency. Between development projects, Caroline is a
keen SQL Server evangelist, speaking and training on SQL Server and Azure SQL Database.

Rachel Horder – Content Developer


Rachel Horder graduated with a degree in Journalism and began her career in London writing for The
Times technology supplement. After discovering a love for programming, Rachel became a full-time
developer, and now provides SQL Server consultancy services to businesses across a wide variety of
industries. Rachel is MCSA certified, and continues to write technical articles and books, including What's
New in SQL Server 2012. As an active member of the SQL Server community, Rachel organizes the Bristol
SQL Server Club user group, runs the Bristol leg of SQL Relay, and is a volunteer at SQLBits.

Simon Butler – Content Developer


Simon Butler FISTC is a highly experienced senior technical writer with nearly 30 years in the
profession. He has written training materials and other information products for several high-profile
clients. He is a Fellow of the Institute of Scientific and Technical Communicators (ISTC), the UK
professional body for Technical Writers/Authors. To gain this, his skills, experience and knowledge have
been judged and assessed by the Membership Panel. He is also a Past President of the Institute and has
been a tutor on the ISTC Open Learning course in Technical Communication techniques. His writing skills
are augmented by extensive technical skills gained within the computing and electronics fields.

Geoff Allix – Technical Reviewer


Geoff Allix is a Microsoft SQL Server subject matter expert and professional content developer at Content
Master—a division of CM Group Ltd. As a Microsoft Certified Trainer, Geoff has delivered training courses
on SQL Server since version 6.5. Geoff is a Microsoft Certified IT Professional for SQL Server and has
extensive experience in designing and implementing database and BI solutions on SQL Server
technologies, and has provided consultancy services to organizations seeking to implement and optimize
database solutions.

Lin Joyner – Technical Reviewer


Lin is an experienced Microsoft SQL Server developer and administrator. She has worked with SQL Server
since version 6.0 and, as a former Microsoft Certified Trainer, delivered training courses across the UK.
Lin has a wide breadth of knowledge across SQL Server technologies, including BI and Reporting Services.
Lin also designs and authors SQL Server and .NET development training materials. She has been writing
instructional content for Microsoft for over 15 years.

Contents
Module 1: Introduction to Operational Management in BI Solutions
Module Overview 1-1 
Lesson 1: Rationale for BI Operations 1-2 

Lesson 2: Roles in BI Operations 1-6 

Lesson 3: Technologies Used in BI Operations 1-10 


Lesson 4: Environment and Operational Standards 1-14 

Lab: Introduction to Operational Management in BI Solutions 1-17 

Module Review and Takeaways 1-20 

Module 2: Configuring BI Components


Module Overview 2-1 

Lesson 1: The Importance of Standardized Builds 2-2 

Lesson 2: Configuration Considerations for BI Technologies 2-10 

Lesson 3: BI Architectures 2-23 

Lesson 4: SharePoint BI Environments 2-28 

Lab: Configuring BI Components 2-31 

Module Review and Takeaways 2-35 

Module 3: Managing Business Intelligence Security


Module Overview 3-1 

Lesson 1: Security Approach to BI Solutions 3-2 

Lesson 2: Security Components 3-8 

Lesson 3: Security Approach for BI Components 3-14 

Lesson 4: The Security Approach in Different BI Environments 3-20 

Lab: Managing Business Intelligence Security 3-22 

Module Review and Takeaways 3-26 

Module 4: Deploying BI Solutions


Module Overview 4-1 

Lesson 1: Application Life Cycle Management for BI Solutions 4-2 

Lesson 2: Stand-alone Deployments 4-5 

Lesson 3: Team-Based Deployments 4-14 

Lab: Deploying BI Solutions 4-19 

Module Review and Takeaways 4-23 



Module 5: Logging and Monitoring in BI Operations


Module Overview 5-1 

Lesson 1: The Need for Logging and Monitoring 5-2 


Lesson 2: Logging Options 5-6 

Lesson 3: Monitoring Options 5-14 

Lesson 4: Setting Up Alerts 5-26 


Lab: Monitoring BI Solutions 5-30 

Module Review and Takeaways 5-34 

Module 6: Troubleshooting BI Solutions


Module Overview 6-1 
Lesson 1: Troubleshooting Failed BI Solutions 6-2 

Lesson 2: Troubleshooting the Data Warehouse 6-6 

Lesson 3: Troubleshooting SQL Server Analysis Services 6-10 

Lesson 4: Troubleshooting SQL Server Reporting Services 6-14 

Lab: Troubleshooting BI Solutions 6-17 

Module Review and Takeaways 6-20 

Module 7: Performance Tuning BI Queries


Module Overview 7-1 

Lesson 1: The Need for Performance Tuning 7-2 

Lesson 2: BI Queries to Performance Tune 7-5 

Lesson 3: Tools for Performance Tuning 7-10 

Lesson 4: Remediating Performance Issues 7-16 

Lab: Performance Tuning a BI Solution 7-21 

Module Review and Takeaways 7-24 

Lab Answer Keys


Module 1 Lab: Introduction to Operational Management in BI Solutions L01-1

Module 2 Lab: Configuring BI Components L02-1

Module 3 Lab: Managing Business Intelligence Security L03-1

Module 4 Lab: Deploying BI Solutions L04-1

Module 5 Lab: Monitoring BI Solutions L05-1

Module 6 Lab: Troubleshooting BI Solutions L06-1

Module 7 Lab: Performance Tuning a BI Solution L07-1



About This Course


This section provides a brief description of the course, audience, suggested prerequisites, and course
objectives.

Course Description

Note: This course is based on SQL Server 2017, and supersedes the B version, which was based
on SQL Server 2016.

This three-day instructor-led course is aimed at database professionals who manage Business Intelligence
(BI) operations. This course looks at various options that give business users the ability to analyze
data and share their findings, starting with managed BI data sources and expanding to personal and
external/public data sources.

Audience
The primary audience for this course is business intelligence professionals.

The secondary audiences for this course are technically proficient business users.

Student Prerequisites
This course requires that you meet the following prerequisites:

 Basic knowledge of the Microsoft Windows operating system and its core functionality.
 Working knowledge of database administration and maintenance.

 Working knowledge of Transact-SQL.

Course Objectives
After completing this course, students will be able to:

 Describe key features of a self-service BI solution.


 Describe the key capabilities of SQL Server BI in a SharePoint environment.

 Describe common Analysis Services operational tasks.

 Describe PowerPivot for SharePoint server.


 Describe Power Query.

 Describe Windows Azure HDInsight.

Course Outline
The course outline is as follows:

Module 1, “Introduction to Operational Management in BI Solutions”. Operational management of BI


solutions is on the increase. An organization’s need for information, coupled with the timely delivery of
this information, means that IT departments are placing as much emphasis on operational frameworks to
support BI solutions as they are on the development outcomes.
Module 2, “Configuring BI Components” covers the correct configuration of BI Components within the
SQL Server stack.
Module 3, “Managing Business Intelligence Security” covers managing the security of data within the
organization.

Module 4, “Deploying BI Solutions” covers deploying BI solutions as part of the BI deployment lifecycle.
You are introduced to a number of tools and practices that can be used.
Module 5, “Logging and Monitoring in BI Operations” covers tools and practices to help the operations
team ensure the continued service of key applications that are used within the business.

Module 6, “Troubleshooting BI Solutions”. The task of trying to troubleshoot failed BI solutions can be
complex. It requires an understanding of the environments in which the BI solution is hosted, and an
understanding of the workloads that take place during the life cycle of the solution. Troubleshooting can
be made easier if the BI operations team has established defined standards for different tiers of servers for
the configuration, security, and deployment of the solution. Standards create a baseline environment for
the servers and the solution so that the BI operations team have a clear understanding of the environment
that they are troubleshooting.

Module 7, “Performance Tuning BI Queries” covers the BI Operations team working with the
development team to performance tune queries.

Course Materials
The following materials are included with your kit:
 Course Handbook: a succinct classroom learning guide that provides the critical technical
information in a crisp, tightly-focused format, which is essential for an effective in-class learning
experience.

o Lessons: guide you through the learning objectives and provide the key points that are critical to
the success of the in-class learning experience.
o Labs: provide a real-world, hands-on platform for you to apply the knowledge and skills learned
in the module.

o Module Reviews and Takeaways: provide on-the-job reference material to boost knowledge
and skills retention.
o Lab Answer Keys: provide step-by-step lab solution guidance.

Additional Reading: Course Companion Content on the
http://www.microsoft.com/learning/en/us/companion-moc.aspx site: searchable, easy-to-browse
digital content with integrated premium online resources that supplement the Course Handbook.

 Modules: include companion content, such as questions and answers, detailed demo steps and
additional reading links, for each lesson. Additionally, they include Lab Review questions and answers
and Module Reviews and Takeaways sections, which contain the review questions and answers, best
practices, common issues and troubleshooting tips with answers, and real-world issues and scenarios
with answers.
 Resources: include well-categorized additional resources that give you immediate access to the most
current premium content on TechNet, MSDN®, or Microsoft® Press®.

Additional Reading: Student Course files on the
http://www.microsoft.com/learning/en/us/companion-moc.aspx site: includes the Allfiles.exe,
a self-extracting executable file that contains all required files for the labs and demonstrations.

 Course evaluation: at the end of the course, you will have the opportunity to complete an online
evaluation to provide feedback on the course, training facility, and instructor.
o To provide additional comments or feedback on the course, send an email to
[email protected]. To inquire about the Microsoft Certification Program, send an email to
[email protected].

Virtual Machine Environment


This section provides the information for setting up the classroom environment to support the business
scenario of the course.

Virtual Machine Configuration


In this course, you will use Microsoft® Hyper-V® to perform the labs.

Note: At the end of each lab, you must revert the virtual machines to a snapshot. You can
find the instructions for this procedure at the end of each lab.

The following table shows the role of each virtual machine that is used in this course:

Virtual machine        Role

10988C-MIA-DC          MIA-DC1 is a domain controller

10988C-MIA-SQL         MIA-SQL has SQL Server 2017 installed

MT17B-WS2016-NAT       MT17B-WS2016-NAT is used to access the internet

Software Configuration
The following software is installed on the VMs:

 Microsoft Windows Server 2012 R2

 Microsoft SQL Server 2017

 Microsoft SharePoint Server 2016

 Microsoft Visual Studio 2017

 Microsoft Team Foundation Server 2018 Express

Course Files
The files associated with the labs in this course are located in the D:\Labfiles folder on the 10988C-MIA-
SQL virtual machine.

Classroom Setup
Each classroom computer will have the same virtual machine configured in the same way.

Course Hardware Level


To ensure a satisfactory student experience, Microsoft Learning requires a minimum equipment
configuration for trainer and student computers in all Microsoft Learning Partner classrooms in which
Official Microsoft Learning Product courseware is taught.

 Processor: Intel Virtualization Technology (Intel VT) or AMD Virtualization (AMD-V)

 Hard Disk: Dual 120 GB hard disks 7200 RPM SATA or better (Striped)

 RAM: 12 GB or higher. 16 GB or more is recommended for this course.

 DVD/CD: DVD drive

 Network adapter with Internet connectivity

 Video Adapter/Monitor: 17-inch Super VGA (SVGA)

 Microsoft Mouse or compatible pointing device

 Sound card with amplified speakers

Additionally, the instructor’s computer must be connected to a projection display device that supports
SVGA 1024×768 pixels, 16-bit colors.

Module 1
Introduction to Operational Management in BI Solutions
Contents:
Module Overview 1-1 
Lesson 1: Rationale for BI Operations 1-2 

Lesson 2: Roles in BI Operations 1-6 

Lesson 3: Technologies Used in BI Operations 1-10 


Lesson 4: Environment and Operational Standards 1-14 

Lab: Introduction to Operational Management in BI Solutions 1-17 

Module Review and Takeaways 1-20 

Module Overview
Operational management of BI solutions is on the increase. An organization’s need for information,
coupled with the timely delivery of this information, means that IT departments are placing as much
emphasis on operational frameworks to support BI solutions as they are on the development outcomes.
With the development of BI solutions complete, the right processes and people should be in place to
ensure that the solution delivers. You should also use supporting technologies to ensure smooth
operations. Furthermore, developing a supporting logging and troubleshooting framework can aid the
debugging and resolution of BI issues.
These elements are brought together within a single operational management framework that
enables a cohesive and proactive approach to managing the BI solution in the production environment. It
also ensures the continued operation of the solution, while providing a structured approach to solving BI
issues.

Objectives
After completing this module, you will be able to:
 Describe the rationale for BI operations.

 Describe roles in BI operations.

 Describe the technologies used in BI operations.


 Describe environments and operational standards.

Lesson 1
Rationale for BI Operations
Do you find that your organization suffers from long running reports? Does your BI processing continue
to run into core business hours? Is your team responsive to BI failures? Do your business users and
management complain about the availability of information?

Regardless of the answers, these questions highlight the increasing need for IT departments to become
more proactive when dealing with the obstacles that prevent business users from accessing information.
Information is seen as a valuable asset to an organization. Therefore, you need to ensure the continued
availability of this information, and that the supporting platforms operate and deliver the information,
based on the business requirements.

Lesson Objectives
After completing this lesson, you will be able to:
 Describe the importance of BI to a business.

 Describe proactive response to issues.

 Describe operational frameworks.


 Describe business continuity planning.

The Importance of BI Growth in the Business


BI should enable decision makers to make better
and more informed decisions through data.
Harvesting meaningful information from an
organization’s data is becoming increasingly
important. There is now more expectation on
business users to show that the decisions that they
make are based on the data at their disposal.

There are many reasons why a business might take this approach, including:
 Gaining a competitive advantage.

 Looking at achieving cost savings.

 Optimizing delivery times of products or services to customers.

 Understanding customer buying behavior.

Not having data available to facilitate decision-making is seen as an obstacle, and this is further
compounded as businesses move to more self-service BI delivery models. Therefore, IT departments not
only need to develop more robust BI solutions, but should also provide a support model that ensures that
BI data is readily available. There must also be processes in place that can respond to BI failures, or poorly
performing BI solutions.

Organizations now recognize that they need to manage BI operations and place more emphasis on
ensuring smooth deployments, timely processing of the BI solution, and the ability to respond to BI
failures. The way this can be implemented varies between businesses. Small organizations might take a
specific, one-off approach, whereas larger organizations will have a more structured approach that might
even be part of an operational framework. However, there are common elements in each approach that
ensure you meet any preset service level agreements.

Proactive Response to Issues


Planned and unplanned disruptive events will
happen. How an IT department responds to these
events is a measure of the effectiveness of the team
that manages the BI operations.
When managing BI operations, you should first
identify the planned events that could cause
disruption. These include:

 Applying service packs or cumulative updates.

 Offline index rebuilds.

 Analysis Services cube processing.

 Data warehouse loads.
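
As a minimal illustration of one such planned event, the following Transact-SQL sketch rebuilds the indexes
on a hypothetical data warehouse fact table (the table name is illustrative, not one used in the course labs).
Because the rebuild runs offline, the indexes are unavailable while it executes, so a statement like this would
normally be wrapped in a SQL Server Agent job and scheduled inside an agreed maintenance window:

-- Illustrative only: offline rebuild of all indexes on a hypothetical fact table.
-- ONLINE = OFF takes the indexes offline during the rebuild, so treat this as a
-- planned disruptive event and run it outside core business hours.
ALTER INDEX ALL ON dbo.FactInternetSales
REBUILD WITH (ONLINE = OFF, SORT_IN_TEMPDB = ON);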

Setting an appropriate schedule and business expectations for such events by using a signed-off
agreement is critical. Typically, such agreements form the basis of service level agreements in large
organizations.

When unplanned disruptive events occur, it is important to fix the issue first. However, after the issue is
resolved, a root cause analysis should be performed. Performing such an activity means you can define
the procedures that should be in place to respond to the issue in the future. In addition, you can tell the
organization’s management about the cause of the issue so that they can use the experience to better
deal with any similar issues that might happen in the future.

Operational Frameworks
Operational frameworks supply IT services with
practical guidance on best practices, principles, and
activities for the planning, delivery, and operations
of IT. They are also concerned with how the
planning, delivery, and operations are managed.
Because this course is concerned with
operational management in the context of a BI
solution, a number of questions have to be
considered, including:

 What monitoring should be in place for the BI solution?

 What channels can users use to report a problem?

 What procedures are in place for resolving known issues?

 What procedures are in place for resolving unknown issues?

 How are the procedures categorized by the business?



 How is problem management communicated to the business?

 What measure is used to determine that a problem is resolved?

The answers to these questions will ultimately consist of a combination of people, technologies and
processes that provide a cohesive approach to managing BI operations. The objective is to deliver reliable
and stable BI environments for the business, and to resume normal service if the BI solution becomes
unavailable. An example of an operational framework is the Microsoft Operations Framework.
Comprehensive coverage of the framework can be found at Microsoft Operations Framework on Microsoft
TechNet:

Microsoft Operations Framework
http://aka.ms/w5dju2

Business Continuity Planning


Business continuity planning is a holistic approach
that is used by organizations to assess and mitigate
any event that could prevent the business from
performing normal operations. It involves making a
plan to ensure that you communicate with the
business about how it intends to run after a
disaster. The details can range from IT to non-IT
related events. The output of this exercise is to
define a Business Continuity Planning document
that can be used throughout the company.
First, it is important for a non-IT member of staff to
identify the critical business processes that are
required to ensure normal business operations. As a data professional, you can identify the IT assets that
support that business process, followed by the associated data that is generated by the IT assets. This
means you can identify which data stores and supporting technologies are on the critical path for
supporting the business process. An analysis of the critical systems and data should be undertaken to
understand the impact when these systems and data are not available to the business. This is normally
expressed in terms of a monetary value, based on the unavailability of the business process.

In the impact analysis, the terms recovery point objective (RPO) and recovery time objective (RTO) come to
the surface. Many SQL Server® professionals now relate these terms to when a SQL Server database
should be restored and how long that might take. However, when it comes to defining RPO and RTO,
operational management takes a more holistic approach.
In operational management, the RPO refers to the amount of acceptable data loss in the event of a critical
system going down. RTO refers to the time it takes a business to resume normal operations from the
moment a disaster occurs, to the moment normal business operations are reconfirmed by the business
stakeholders.
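
Although operational management defines these objectives holistically, they still translate into concrete
settings for the individual components of a BI solution. As a minimal sketch only, assuming a hypothetical
data warehouse database in the full recovery model and a placeholder backup path, an agreed RPO of
15 minutes simply means that transaction log backups must run at least every 15 minutes (for example,
from a SQL Server Agent job):

-- Illustrative only: the database name and backup path are placeholders.
-- Scheduling this log backup every 15 minutes caps potential data loss for this
-- component at roughly the agreed 15-minute RPO.
BACKUP LOG DataWarehouseDB
TO DISK = N'R:\Backups\DataWarehouseDB_log.trn'
WITH COMPRESSION;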

BI solutions are increasingly becoming a critical part of business processes. Therefore, it is important to
work with the business to identify the critical business processes, and how the BI solution relates to those
business processes. From this, you can adapt the operational framework to define service level
agreements on the availability of these critical systems, in addition to adapting the supporting operational
framework to ensure that priority is given to these systems. Finally, you will provide the business with an
appropriate contribution to the Business Continuity Planning document.

Question: Does your organization manage its BI environment within an operational framework? What
are the advantages and disadvantages?

Lesson 2
Roles in BI Operations
Managing BI operations consists of a combination of people, technologies, and processes that provide a
cohesive approach to BI solutions. In this lesson, you will explore the people who contribute to the
effectiveness of the operational support in a BI solution. In large organizations, one or more individuals
might fill these roles. In smaller organizations, one person might perform multiple roles.

Lesson Objectives
After completing this lesson, you will be able to describe how the following roles can contribute to an
effective operational management framework for a BI solution. The roles include:

 Data directors/managers

 BI architects

 BI developers

 Database administrators

 Business users

Data Directors/Managers
More organizations are employing the skills of data
directors and/or managers to oversee the strategic
direction and support of the data that is generated
within the business. A BI director will work with a
company’s board of directors to understand the
overall business strategy and direction, and
articulate how the data within the business can add
value to meeting the needs of the business strategy.

The business value can be provided by internal
systems such as transactional or BI data, or from
external open data that can complement the
existing data stores. Solutions can be developed
and deployed to deliver the value required to meet the strategic business needs.

In addition, a support model for the data has to be created. To ensure ongoing operations, BI
directors/managers will devise strategic plans for supporting the solutions that have been developed. This
will involve the management of the supporting technologies, defining the appropriate roles to support
the data, and defining processes that are understood by everyone in the organization. The BI director will
work with the board to set the relevant Service Level Agreement for a given BI solution.

In the context of a BI solution, common technologies that might be supported are SQL Server Database
Engine, Integration Services, Analysis Services and Reporting Services. More recently, technologies such as
Master Data Services and Data Quality Services have gained wider adoption within BI solutions.

BI Architects
A BI architect is a top-level analyst who will take
direction from the data director and create a BI
architecture that will meet the strategic needs of
the business. The BI architect will define a BI
architecture that provides a framework, using
technologies to gather and store data for reporting
and analytical purposes. BI architectures will vary,
depending on the direction that has been given by
the data director.

Key requirements that will affect a BI architecture include:

 Functional Requirements. This type of requirement will define the purpose of the BI solution and the
expected results. For example, a BI solution might be required by a retail business that enables its staff
to analyze and report the sales amount and quantities by branch, region, or product.
Typically, functional requirements are developed into a workable solution that provides the required
information to the users—this is often performed by BI developers who will develop a solution to
meet the functional requirement.

 Nonfunctional Requirements. These requirements deal with how the solution will operate. Some of
the areas that are considered include:

o Availability
o Disaster recovery

o Backup

o Maintainability
o Configuration

o Responsiveness

Many of the nonfunctional requirements will be managed and monitored within an operational
framework.

BI architectures are typically very focused on solution architectures. In other words, the architecture is
created on the basis of providing a solution to meet a specific need. It is important that a BI architecture
be considered in the context of an enterprise solution architecture. This ensures that the solution takes
advantage of existing technologies that are used within the business, and more importantly, that the
solution is compliant. This might require the BI architect to seek input from another team that manages
the enterprise architecture.

BI Developers
BI developers will typically deliver most of the
functional requirements of a BI solution. Much of
the work involves:

 Creating a data warehouse within the SQL Server Database Engine.

 Developing an extract, transform and load (ETL) process using SQL Server Integration Services (SSIS).

 Developing data models for analysis using SQL Server Analysis Services (SSAS).

 Creating reports using SQL Server Reporting Services (SSRS).

 Managing data quality with SQL Server Data Quality Services (SSDQS).

 Mastering data with SQL Server Master Data Services (SSMDS).


Developing a BI solution to deliver the data as per the functional requirements is often used as a measure
for success. There is an opportunity for developers to aid the operational management of the solutions
they create by adding the following into their solutions:

 Integration Services logging to report ETL failures.

 Reporting Services execution logs to view report executions.

 Analysis Services logging to aid debugging or query processing.

 Embedded Management Report Packs to consolidate operational reports.


These options are not mandatory, and a BI solution would still operate without them. However, adding
this functionality to the BI solution will help those who have to manage the BI operations by providing
information on a SQL Server component that might be experiencing issues.
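For example, when Integration Services packages are deployed to the SSISDB catalog, the BI operations
team can report recent ETL failures with a query such as the following. This is a minimal sketch that uses
the built-in catalog.executions view, where a status value of 4 indicates a failed execution; only the folder,
project, and package names will differ in your environment.

Querying the SSISDB catalog for failed package executions

-- Run in the context of the SSISDB database
SELECT TOP (20)
       e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       e.start_time,
       e.end_time
FROM catalog.executions AS e
WHERE e.status = 4
ORDER BY e.start_time DESC;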

Database Administrators
Database administrators (DBAs) are usually the data
professionals who will support a BI solution after it
has been deployed to the production environment.
This work can involve them being on call to support
the mission critical systems within the business—24
hours a day, seven days a week. Should an
organization’s BI solution be deemed mission
critical, then it is highly likely that the responsibility
for the operational management of the solution will
fall with the DBA team.

In these circumstances, it is highly advisable to take advantage of the DBA’s expertise at the design and
planning phase of a BI solution. DBAs can advise on the appropriate servers that will be required for the
solution, and how they should be configured. In addition, DBAs can advise on the existing SQL Server

estate and how the solution can fit into that setup within the organization. They can also advise the BI
development team on the type of logging information that would be useful to help them solve errors.

There are organizations that do not employ a dedicated DBA but the activities for supporting a mission
critical BI solution will still remain. The same information regarding server configuration and support will
have to be managed. Therefore, it is important that these aspects are covered by the BI architect with the
support of the BI developers—and that support for the solution, after it has been produced, is established
early in the planning and design.
Some organizations might outsource the DBA service to third-party managed organizations—you should
liaise with the partner to determine what would come under the scope of support for the BI solution.

BI Consumers
BI consumers are regarded as the individuals in the
business who will use the data from the BI solution
for reporting and analytical purposes. From an
operational management perspective, they will
likely be one of the first channels to alert the BI
operations team to problems that are occurring
with the system.

A process should be set up so that users can report the problem, monitor progress of a problem being
resolved, and receive confirmation that an error has been fixed. Microsoft System Center is a technology
you can use to manage problems by implementing the help desk ticketing system that is found in the
Service Manager component.

By using Service Manager in Microsoft System Center, users can log help desk support tickets to report
incidents and change incidents for a range of IT systems and solutions. IT departments can update
incidents and track the progress of the work being done. Having a system in place provides a
consolidated view of the issues that are affecting services within the IT infrastructure. Appropriate
responses can then be made to ensure that the problem is resolved; or an update is provided for business
users.

Question: Does your organization have similar roles to the ones outlined in this lesson?

Lesson 3
Technologies Used in BI Operations
A range of technologies can be used within BI solutions. Much of the focus in using these technologies is
to deliver the functional requirements of a BI solution through BI development. However, the
technologies used also provide functionality that can be valuable when managing the operations of a BI
solution.

Lesson Objectives
After completing this lesson, you will be able to:

 Describe on-premises technologies.

 Describe cloud technologies.

 Describe Visual Studio®.

 Describe Team Foundation Server.

On-Premises Technologies
Several technologies used within a BI solution
contain functionality that can help with BI
operations. These technologies include:

Operating System. This provides the platform on
which the BI solution will reside. It is often an
overlooked component when trying to solve errors,
or understand performance bottlenecks. Windows
Server® contains many tools that can give you
information regarding the operating system itself,
and the hardware. Examples of the supplied tools
that can help include:

o Event Viewer

o Windows Reliability Performance Monitor

o System Information

SQL Server Database Engine. The database engine provides the data repository that holds the data
warehouse and other supporting databases. When there are errors or performance bottlenecks, a number
of tools can be used to identify issues, including:

o Activity Monitor

o Dynamic Management Views

o Extended Events

o SQL Server Profiler



SQL Server Integration Services. SSIS provides the functionality to identify any ETL processes that have
failed, or that are long running. Some tools that can be used include:

o The SSISDB Catalog

o SSIS Logging

o Event Handlers
SQL Server Analysis Services. SSAS provides a wide range of tools with which you can monitor the
processing and query performance of data models, including:

o SQL Server Profiler

o Query Logging

o Error Logging

o Flight Recorder

SQL Server Reporting Services. SSRS contains functionality that you can use to identify errors or
performance issues by using:

o Reporting Services Logging


o Execution Logs

SQL Server Master Data Services. There is the option to enable Master Data Services logging through
tracing. This is enabled through the web.config file for Master Data services, requiring modification to the
file before it can be used.

SQL Server Data Quality Services. You can enable log settings in Data Quality Services to track any
operational issues in the Data Quality Services server, the Data Quality Client, and the Data Cleansing task
that is used within SSIS.
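As an illustration of the Reporting Services options above, the following is a minimal sketch that uses the
ExecutionLog3 view to identify recent report executions, their status, and how long each one took. It
assumes a native mode report server with the default database name of ReportServer.

Reviewing report executions with the ExecutionLog3 view

-- Run in the context of the report server database (ReportServer by default)
SELECT TOP (20)
       ItemPath,
       UserName,
       Status,
       TimeStart,
       TimeDataRetrieval + TimeProcessing + TimeRendering AS TotalDurationMs
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;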

Cloud Technologies
Microsoft Azure™ provides an increasing range of
services that can support BI, reporting, and
analytical solutions. Cloud solutions are being
considered by more organizations, particularly
when the hardware within their own premises
becomes defunct. The following are some examples
of services that are available to provide a cloud-
based BI solution:

Azure SQL Data Warehouse. An enterprise class, distributed database that can deal with the storage of
both relational and nonrelational data, and scale to terabyte levels through its Massively Parallel
Processing (MPP) capability.

Azure Data Factory. This manages the movement and orchestration of data from both on-premises and
cloud data sources.

Azure Data Lake. This provides the ability to perform data analytics that can scale to terabytes of data
using the U-SQL query language that is an extension of the SQL language—with additional C# capability
to perform distributed analytics.

Power BI. This enables you to expose both cloud and on-premises data to create dashboards and rich
visualizations.

Other technologies that can be used within a cloud data architecture include:

o Azure VMs that can host SQL Server components

o SQL Database
o HDInsight®

o Machine Learning

o Streaming Analytics

o Event Hubs

o Azure Data Catalog

o Azure Search

o Cortana® Analytics Suite

Microsoft Azure services capture information regarding the compute and storage usage of each service—
known as telemetry. Organizations can expose some of the data in this telemetry to provide operational
insights that enable them to manage the systems.

Within Power BI, you can make use of content packs—these are prepackaged reports and dashboards,
provided by both Microsoft and third-party suppliers, that can enable you to quickly deploy reports and
dashboards on a range of areas. At the time of writing, Power BI provides the following content packs that
can help with operational management:

 Microsoft Azure Enterprise content pack.

 SQL Database Auditing content pack.

Visual Studio
Visual Studio provides an integrated environment
for developing a wide range of applications and
cloud services for Windows, iOS and Android
platforms. It is also an environment that can be
used for developing BI solutions, including:

 SSIS packages

 SSAS data models

 SSRS reports

 SQL Server database solutions


From a BI operations aspect, you can utilize Visual Studio to create builds and manage deployments.
Builds enable you to store different versions of a project within the solution or project properties. They
enable you to assign an active build, which is the build that will be opened when a Visual Studio project is
opened. You can then create different build versions as the project evolves and progresses.

You can also set a build to the Debug configuration—this denotes to Visual Studio that the code is to be
used for the purpose of debugging. Alternatively, you can set the code to Release configuration, which

denotes to Visual Studio that the code is ready for a final release. You can also set a build to both Debug
and Release configuration at the same time. Using these settings within Visual Studio can help manage
the releases of the code onto a local desktop or server.

The issue with using Visual Studio alone is that it does not effectively support BI development projects
where BI solutions are developed by multiple team members. However, this approach could be used for BI
solutions that are developed by a single BI professional.

Team Foundation Server


Team Foundation Server (TFS) addresses the issue
of managing application development across a
team of multiple developers. There is an on-
premises version of the product, and a cloud
version that is named Visual Studio Team Services.

The key highlights of TFS are:

 It enables multiple developers to work on the same code base.

 It enables versioning of code.

 It enables code to be rolled back.

 It provides the ability to centrally manage builds and releases.

 It includes automation capabilities.


From an operational perspective, the BI operation team will likely become involved in the release
management phase of the Application Lifecycle Management that TFS can facilitate. This will involve the
use of builds and releases, as is the case with Visual Studio—the difference being that all of the
developers’ code is held centrally, so that the release management can occur from a single source.

Visual Studio requires the add-in Team Explorer to be installed to make use of a TFS. This enables
developers to connect to the instance of a TFS, and then create or select a Team Project Collection to host
multiple projects holding the files relating to the same code base. After connecting to a TFS, a local
workspace area should be defined on the BI developer’s desktop. This enables the project files for the TFS
server to be downloaded locally. Any changes that are made to the project files occur locally until the files
are checked into TFS.

A BI developer can add a solution to source control by using the New Project dialog box, which includes
the option to add the solution to source control. When clicking Save, the developer will be asked which
TFS server to save the project to, and then to select the relevant project collection. After this has been
checked in, BI objects can be developed in the normal way—the difference being that the BI developer
must check in any saved work to the TFS server so that it is committed to the TFS database. BI developers
can see and browse the solution files that are stored in the TFS server by using Source Control Explorer.

Question: How many students in the room use Team Foundation Server in their BI solutions?

Lesson 4
Environment and Operational Standards
Managing BI operations is more than just responding to errors and performance bottlenecks. The
operational framework should try to take proactive steps to ensure that the team is dealing with
environments that are well understood, and procedures that are standardized. Taking steps to define
environments and procedures will ensure that the operations team knows how to respond to issues in
known environments in a timely manner.

Lesson Objectives
After completing this lesson, you will be able to:

 Describe production vs. nonproduction environments.

 Describe standard operating procedure.

 Describe emergency operating procedures.

Production vs. Nonproduction Environments


Production environments describe the servers and
services that will host the BI solutions that are put
into operation for use by all of the BI consumers.
Nonproduction environments are there to facilitate
the development, testing and acceptance of the BI
solutions while they are being developed. As a
result, there might be different types of
nonproduction environments—these could include:

 Development environments. For exclusive use of the BI development team. In this environment, it is
typical for the developers to have full administrative rights to the operating system and SQL Server
instances so they can develop solutions productively.

 Test environments. Used by a smaller subset of developers and a testing team to perform unit
testing of BI functionality and performance. In this environment, developers will have more restrictive
permissions to the servers—they might only be allowed to deploy the BI solution to the testing server.
 User Acceptance Testing (UAT) environments. UAT environments will be used by a select pool of
trusted BI consumers and members of the testing team. This ensures that the entire BI solution meets
the functional and nonfunctional requirements that have been captured by the BI architects. The
intention is that testing is completed successfully for sign-off into the production environment.

The preceding terms represent some of the common terms that are used to describe nonproduction
environments. Some organizations may label these environments with different names, while other
organizations might use additional environments.

It is the responsibility of the BI operations team to ensure the continued operations of production
environments—to service the business user’s information requests—and the nonproduction environments
to ensure that developers and testers can continue to develop new BI functionality.

When working with multiple servers and environments, the BI operations team would be more effective
by defining:

 Build standards for production and nonproduction servers.

 Configuration standards for the operating system and SQL Server components.

 Security standards for each environment.


 Deployment processes for deploying BI solutions.

 Monitoring standards for the operating system and SQL Server components.

This will formulate part of the operational framework under which the BI team will operate.

Standard Operating Procedures


A standard operating procedure (SOP) is a
documented process which, with the agreement of
the business, is to be performed regularly to
support the continued operations of a server or a
service. There are many SQL Server operations that
fall under the scope of a SOP, including:

 Backups
 Index maintenance

 Cube processing

 Data warehouse loading

Defining a SOP is not necessarily only concerned with the technical detail of the work that is being
undertaken. You will also need to consider the business justification for the action, the time and frequency
the action will be executed, and the expected time it takes for the operation to be completed. Having a
SOP for these recurring activities will inform the BI operation team of what is occurring on the servers at
any given time.

Occasionally, a SOP can also be applied to activities that may not necessarily take place on a recurring
basis. For example, a business might experience performance problems with a production server. This
might occur at random times, and the BI operations team may know how to resolve the issue without
understanding the root cause. Therefore, a SOP can be defined to perform an activity to resolve the issue,
but it is executed on a given event occurring on the server, rather than at a given time.

What benefits do SOPs provide? SOPs inform the BI operations team of the ongoing procedures on a
server. A SOP advises on the actions to take when a known issue arises and can give the BI operations
team the ability to perform these actions without the formal sign-off from a data director or manager—
because the procedure has been agreed in advance of the work being undertaken.
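To make this concrete, the following is a minimal sketch of the kind of scripted steps that a SOP might
wrap. The database name, backup path, and table name are illustrative assumptions; a real SOP would also
record the schedule, the expected duration, and the business justification for each step.

Example scripted steps behind a SOP

-- SOP step: nightly full backup of the data warehouse (database name and path are assumptions)
BACKUP DATABASE [AdventureWorksDW]
TO DISK = 'R:\Backups\AdventureWorksDW_Full.bak'
WITH CHECKSUM, COMPRESSION, INIT;

-- SOP step: weekly index maintenance on a large fact table
-- (table name is an assumption; run in the data warehouse database)
ALTER INDEX ALL ON dbo.FactInternetSales REBUILD;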

Emergency Operating Procedures


Emergency operating procedures (EOP) are
performed in the event of a serious disruption to
the operations of a BI solution. Examples of a
serious disruption include:

 A server becoming unavailable.

 Reports timing out on execution.

 BI loads failing to complete.

 Missing data models or reports.


In such circumstances, the cause of the problem is
usually unknown. Therefore, the team should
perform a number of actions:

1. Inform management of the outage.


2. Perform an investigation of the issue.

3. Present findings and a proposed solution to fix the issue.

4. Seek sign-off from management to perform the fix.

5. Apply the fix.

6. Confirm the fix resumes normal operations.

7. Inform the management that the service is resumed.


8. Inform the business that the service is resumed.

9. Perform a root cause analysis.

10. Optionally, define any SOPs that can be used in the future.
There are a number of objectives in taking this approach:

 To resolve the issue as soon as possible.

 To provide updates to the management and business regarding the outage.


 Understand the root cause of the disruption.

 To mitigate future disruption by applying a SOP.

Question: Do you see any advantages to using standard or emergency operating procedures?

Lab: Introduction to Operational Management in BI


Solutions
Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

Adventure Works employees are increasingly frustrated by the time it takes for business reports to
become available on a daily basis. The existing managed BI infrastructure—including data warehouses,
enterprise data models, and reports and dashboards—are valued sources of decision-making information.
However, users are increasingly finding that it takes too long for the data to be processed in the overnight
load, resulting in reports not arriving to business users until the early afternoon.

Objectives
After completing this lab, you will be able to:

 Identify the roles required to support BI operations.

 Use Visual Studio Team Explorer to work with team-based BI solutions.

Estimated Time: 45 minutes

Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Roles in BI Operations


Scenario
Adventure Works has identified that there must be a more structured approach to managing the IT assets
that support the BI operations within the business. You have also been tasked with improving the
operational support for the BI solution by identifying areas of responsibilities and the roles required.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Review the Transcript

3. Identify Roles to Support BI Operations

 Task 1: Prepare the Lab Environment


1. Read the lab and exercise scenarios.
2. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

3. Run Setup.cmd in the D:\Labfiles\Lab01\Starter folder as Administrator.

 Task 2: Review the Transcript


 Use WordPad to review the Adventure Works employee interviews in Interviews.docx in the
D:\Labfiles\Lab01\Starter folder.

 Task 3: Identify Roles to Support BI Operations


1. Collaborate with two or three other students.

2. Use Roles.docx in the D:\labfiles\Lab01\Starter folder as a framework to identify the roles,
employees and responsibilities, based on the interviews.

3. Close both documents.

Results: At the end of this exercise, you should have created a table that shows the roles required, with a
named employee who has key responsibilities.

Exercise 2: Using Team Explorer in Visual Studio


Scenario
To improve the delivery and deployment of BI solutions within Adventure Works, a Team Foundation
Server (TFS) has been installed. You need to create the project collection to host a BI solution file with TFS
so that the team can work on BI development objects at the same time.

The main tasks for this exercise are as follows:

1. Connecting to a Team Foundation Server

2. Create a Team Project Within Team Foundation Server

3. Create an Integration Services Project in Team Foundation Server

 Task 1: Connecting to a Team Foundation Server


1. Start Visual Studio 2017 and connect to the MIA-SQL Team Foundation Server.
2. Select the Team Collection project AdventureWorks BISolutions.

 Task 2: Create a Team Project Within Team Foundation Server


1. Use Visual Studio 2017 to create a new team project.
2. Name the project Adventure Works, specifying the Scrum process template, and Team Foundation
version control.

 Task 3: Create an Integration Services Project in Team Foundation Server


1. Use Visual Studio 2017 to create a new Integration Services project.

2. Name the project AWMigration, in a solution named AWMig, and then save it in the
D:\Labfiles\Lab01\Starter folder, and store in Source Control.

3. Rename Package.dtsx to AWMig_Control.dtsx, and check the solution in to TFS.

4. Confirm that the solution AWMig.sln has been added to TFS using Source Control Explorer.

Results: At the end of the exercise, you will have configured Team Explorer to connect to a TFS server
named mia-sql. You will have created a project collection and stored an Integration Services project
within the project collection in the TFS server. You will have made a change to an object and checked the
object back in to TFS. Finally, you will view the changes in Source Control Explorer.

Question: Based on the interviews in the lab, discuss the findings of the group regarding the
role assignments and the responsibilities of each role. Are there any roles missing?

Question: Based on the interview document, how would you improve the BI developer’s
current working environment?

Module Review and Takeaways


In this module, you have been introduced to the operational management of BI solutions. You should
now be able to:

 Describe the rationale for BI operations.

 Describe roles in BI operations.

 Describe the technologies used in BI operations.

 Describe environments and operational standards.



Module 2
Configuring BI Components
Contents:
Module Overview 2-1 
Lesson 1: The Importance of Standardized Builds 2-2 

Lesson 2: Configuration Considerations for BI Technologies 2-10 

Lesson 3: BI Architectures 2-23 


Lesson 4: SharePoint BI Environments 2-28 

Lab: Configuring BI Components 2-31 

Module Review and Takeaways 2-35 

Module Overview
The correct configuration of the BI components within the SQL Server® product stack will have a big
impact on the stability and performance of the overall BI solution. Configuring components by using best
practice will enable the BI operations team to rule out the data platform as a root cause of issues that
occur within the environment.

Defining standards for server builds can help the effectiveness of the team to resolve issues in a known
configured environment. Equally, understanding the type of architecture that a BI solution is implemented
in will drive the standards for a given architecture.

Objectives
After completing this module, you will be able to:

 Describe the importance of standardized builds.

 Describe the configuration considerations for BI technologies.

 Describe BI architectures.

 Describe SharePoint® BI environments.



Lesson 1
The Importance of Standardized Builds
Standardizing builds is the process of documenting and implementing a server or software configuration
for a given technology. It encourages specific configurations to be implemented using best practice, and
provides the following benefits:

 It ensures that the server and/or software is configured in an optimal way.

 It provides a known configuration for a server and/or software for a given environment.

 It ensures a consistent configuration across multiple servers that an operational team can support.

 It can improve the effectiveness of an operational team reacting to an issue within an environment.
Standardization is used in many aspects of a business. Applying standardization to a BI operational model
will enable a business to achieve efficiencies when supporting the solution operationally.

Lesson Objectives
At the end of this lesson, you will be able to:

 Establish business requirements.

 Describe hardware standards.

 Describe software standards.

 Describe meeting availability and disaster recovery requirements.

Establishing Business Requirements


Business requirements drive the standards for the
servers and software that are to be used within a
BI solution. From an operational perspective,
configuring BI components must meet the
nonfunctional requirements of the BI solution and
include the following:

Availability

Availability refers to the amount of time that system resources are available and is an important element
of the nonfunctional requirements. The business needs to articulate how available the solution should be
because some organizations may not be concerned with high availability. Some organizations might
require a system to be available only during normal business hours; others may require high availability
24 hours a day, seven days a week.

Availability can be implemented at a hardware level and include hardware solutions such as redundant
array of independent disks (RAID) to protect against hard disk failure, and dual power supplies to provide
redundancy in power. Microsoft® Windows® and SQL Server also include features that provide
redundancy at the server and database level, including:

 Windows Server® Failover Clustering.

 Always On Availability Groups.



 Log Shipping.

 Peer-to-Peer replication.

Deciding which SQL Server technology to implement depends on a number of factors, such as whether
the solution needs protecting at a server level, or at a database level. For server level protection, consider
Windows Server Failover Clustering. For database protection, Always On Availability Groups, log shipping,
and peer-to-peer replication provide different levels of protection. Agreeing requirements for how
database failures should be handled will influence the final decision on which technology will meet the
business requirements.
Defining standards for how the hardware and software features are to be configured improves
consistency, and can help with capacity planning for a given feature. For example, you could define that
the Quorum drive of a Windows Server Failover Cluster is identified as Q:\ drive, stored on a SAN, with a 1
GB capacity, or that all Windows Server Failover Cluster configurations must use dual power supplies, and
RAID 5 arrays.

Disaster Recovery

Disaster recovery is the process of recovering systems that have failed, in line with the agreed recovery
time objective (RTO) and recovery point objective (RPO). The technologies that can be used to facilitate
disaster recovery include:
 Windows Server Failover Clustering.

 Always On Availability Groups.

 Log Shipping.
 Peer-to-Peer Replication.

 Backup and Restore.


As with availability, defining hardware and software standards can help to support the configured
solutions, including backing up databases and recovering failed services and data. This process should be
part of the business continuity planning document.

Maintainability
Maintaining the data and structures in SQL Server databases, and maintaining the operations of additional
SQL Server technologies, should be performed on a regular basis. This will ensure that the services
continue to provide data to users that is both functional and performant. Standards should be defined for
the following operations:

 Backup

 Testing restore operations


 Index maintenance

 Database consistency checks

In addition to the standard database maintenance, BI solutions require the data to be maintained
through:

 SSIS execution

 Cube processing

 Report generation

Typically, all of the preceding activities are scheduled to perform on a regular basis. Standards should be
defined and form the basis for standard operating procedures within the business.

Performance

Defining whether high performance or consistent performance is more important to the business will help
determine a consistent approach to configuring the BI supporting technologies.

Hardware Standards
The hardware on which the BI solution resides will
have an impact on the performance and the
operations of the solution. In an ideal world,
mission critical services would run on dedicated
hardware. However, many businesses have to
balance the needs for a service against the cost of
running dedicated servers. Often, only mission
critical services that would cause a proven
monetary loss if they stopped are afforded the
luxury of a dedicated server.

As a result, standards should be considered and defined for the hardware components that are used. The
type of SQL Server component that is used will be affected differently by the following hardware
components:

 Network
 Hard disk

 Memory

 CPU

The minimum specifications required for the version of Windows and SQL Server that is being used should
provide a starting point for configuring the hardware. In addition, there are other practices that should be
followed to optimize the hardware usage of the physical server.

Network

For BI solutions, you can use fast network cards for production servers. In addition, static IP addresses
should be configured on the server on which SQL Server resides. For high availability solutions, multiple
network cards should be installed to provide redundancy. For greater redundancy of network failures,
configure the network cards with multiple DNS and default gateway addresses.

For SQL Servers that deal with high volumes of data transfer across a network, which is typical of data
warehouse loads, you should enable jumbo packets on a network card. This will increase the volume of
data that can be handled by each network packet from the default 1,500 bytes up to 9,000 bytes. This
increase will reduce the overhead of handling traffic at the network layer, but all devices on the path,
including switches, must support this feature.

To enable jumbo packets on a network card, perform the following steps:

1. Open the Network and Sharing Center.

2. Click Change adapter settings.

3. Right-click the Network Card, and select Properties.

4. On the Networking tab, click Configure for the network adapter.



5. Select the Advanced tab.

6. Select Jumbo Frame and change the value from disabled to the desired value.

Hard Disks

Hard disks have an impact on the performance of a BI solution, including data warehouse load times,
analysis services processing times, and report performance.
Hard disk specification has the biggest impact on a data warehouse. As this is stored in the database
engine, the following best practices apply:

 Place database data files and log files on separate drives.

 Place other system databases on a separate drive.

 Configure NTFS cluster size to 64 KB to align with the 64 KB extents used by SQL Server.

From this best practice, three separate volumes would be needed to meet these requirements. If this is not
possible, define standards based on the resources available, and set expectations for service availability
and performance.
For high availability, consider a RAID configuration to provide redundancy. The following standard RAID
options are available:

 RAID 0 – Stripe set. Data is divided between all available disks with no redundancy.

 RAID 1 – Mirrored RAID array. The contents of one disk are duplicated on another.
 RAID 5 – Striped set with parity. Data is divided across all but one disk, which contains a parity bit.

 RAID 10 – Mirrored stripe set. Combines the performance benefits of RAID 0, with the redundancy
benefits of RAID 1.

Typically, databases are stored on a Storage Area Network (SAN). SANs can be configured in different
ways; for example, they can be partly configured with traditional disk based storage, with the remainder
utilizing SSD technology. In fact, some SAN technologies can adaptively change the content that is stored
on an SSD or traditional storage—this is called adaptive optimization.

This might appear to provide benefits, but can cause problems. For example, a SAN that typically stores a
database on an SSD may drop the database to traditional storage because web developers are running
performance tests. Because of the high usage, the web server gets promoted to use the SSD at the
expense of the database. In this case, turning off adaptive optimization would be more desirable.

Memory

Make at least 2 GB of server memory available to the operating system, as SQL Server operates on top of
the operating system. Additional memory should be allocated for:

 Non SQL Server services, such as Internet Information Services (IIS).


 Each SQL Server component, such as SSAS or SSIS.

Consider the workloads of the applications running on the server. For example, most business users will
query approximately 20 percent of the data that is held in a data warehouse. As a result, if a data
warehouse is 50 GB in size, you would want to allocate 10 GB of memory as a starting point. For a true
picture of the workload, you can run Windows Performance Monitor, looking at the Process: Working
Set counter against the SQL Server process to identify how much memory is being used.

Alternatively, for an average view since the last SQL Server restart, you could run the following command:

Querying SQL Server memory usage with the sys.dm_os_process_memory DMV (dynamic
management view)
SELECT
(physical_memory_in_use_kb/1024) AS Memory_usedby_Sqlserver_MB,
(locked_page_allocations_kb/1024) AS Locked_pages_used_Sqlserver_MB
FROM sys.dm_os_process_memory;

This query returns a record of how much physical memory is being used, and how much memory is
locked. Locked memory will be reserved for SQL Server, and will not yield to the request from the
operating system to take this locked memory.

CPU

In the first instance, reserve CPU capacity for the operating system. This can be one CPU core on systems
that contain up to four CPU cores, or two cores on systems that go beyond four CPU cores. As with
memory, there should be a balance of the CPUs across the applications that are running on a server.
The remaining decisions on CPU should account for:
 The minimum CPU required by non SQL Server applications; for example, Internet Information
Services (IIS).

 The minimum CPU requirements of each SQL Server component.

You can establish the current CPU usage using Windows Performance Monitor—specifically, Processor: %
Privileged Time, which measures the percentage of CPU that Windows is using, and Processor: % User
Time, which measures the percentage of CPU that other applications, such as SQL Server, are using in real
time. You can also use a range of dynamic management views (DMVs) to look at CPU usage by query or
by database, in addition to other factors.
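As an illustration of the DMV approach mentioned above, the following is a minimal sketch that
aggregates cached query statistics to show which databases have consumed the most CPU time. It only
reflects plans that are currently in the plan cache, so treat the results as an indicator rather than a
complete history.

Reviewing CPU usage by database from cached query statistics

SELECT TOP (10)
       DB_NAME(st.dbid) AS database_name,
       SUM(qs.total_worker_time) / 1000 AS total_cpu_ms,
       SUM(qs.execution_count) AS total_executions
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
GROUP BY DB_NAME(st.dbid)
ORDER BY total_cpu_ms DESC;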

Physical vs. Virtual Servers

SQL Server does not distinguish between whether it is running on a physical server or a virtual server, so
there are the same considerations for networking, hard disks, memory, and CPU. One key point is that
multiple virtual servers can exist on the same physical host. It is important to ensure that the virtual
machines' combined hardware settings are not collectively higher than those that are available on the
host. For example, a physical server with 192 GB of RAM might host four virtual servers, each allocated
64 GB of RAM, a total allocation of 256 GB. In this case, memory pressure could exist on all of the virtual
servers.

Software Standards
The key to managing operational environments is consistency. Defining standards ensures that
consistency is applied across all servers in all environments. The following key practices should be
followed when considering software standards that are to be supported by a BI operational team.

Define a version of Windows that represents the corporate standard

The decision on which version of Windows to use may rest with an enterprise architect. It is important that
the BI operational team understand the corporate standards for using Windows and adhere to them. The
chosen version of Windows should also include the relevant Service Pack or Cumulative Update version.
All Windows servers should meet this corporate standard.

Define the version of SQL Server that represents the corporate standard
The BI architect should provide information to the BI operations team about which version of SQL Server
to use, including any Service Pack or Cumulative Update that should be installed on the SQL Server.

Define the edition of Windows and SQL Server for different environments

The edition of Windows and SQL Server to be used will depend upon the environment on which these
technologies are installed. For example, a nonproduction environment, such as a development
environment, might use Windows Server 2016 Standard Edition with SQL Server 2017 Developer Edition.
The main driver for making this decision is usually an attempt to reduce costs—the licensing cost for these
editions is lower than other editions, though the SQL Server feature set is the same as that in the SQL
Server 2017 Enterprise Edition.

Define antivirus standards for SQL Server files

Many organizations use antivirus software to provide protection for both the server and the desktops.
Antivirus software can slow down the performance of a computer that is running SQL Server. Antivirus
standards should be defined for SQL Server files and for how they are handled. If you have been given
approval by an organization’s security team, it is common for antivirus exceptions to be defined for the
following file types on a computer that is running SQL Server:

 *.mdf

 *.ndf

 *.ldf

 *.BAK

 *.TRN

 *.TRC

In addition, Windows Server Failover Clusters should not have antivirus software installed, because this is
known to cause issues with a cluster.

Define a list of supported third-party applications


The BI operations team should sanction a list of approved third-party applications for a given
environment with the organization. For example, the BIDSHelper application could be sanctioned for a
development and test environment, but would not be allowed on a production environment.

How to ensure compliance

At an individual server level, you can run the following command on a Windows server to ascertain which
version of Windows is running:

How to determine the version of Windows


At the lower-left of the desktop, click the Start button and type:

winver

You can also run the following command in SQL Server Management Studio to establish which version of
SQL Server is running:

How to determine the version of SQL Server

SELECT @@VERSION;
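If you need more detail than @@VERSION returns in a single string, a sketch such as the following returns
the version, update level, and edition as separate values that can be recorded against the corporate
standard for each server.

Checking version, update level, and edition with SERVERPROPERTY

SELECT SERVERPROPERTY('ProductVersion') AS product_version,
       SERVERPROPERTY('ProductLevel') AS product_level,
       SERVERPROPERTY('Edition') AS edition;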

If you need to look at the version of an entire SQL Server and Windows estate, you can use the Microsoft
Assessment and Planning (MAP) toolkit. This can read the list of servers on a network, in addition to
providing information about the servers and installed Microsoft applications. For more information about
the MAP toolkit, see Microsoft Assessment and Planning (MAP) Toolkit in Microsoft TechNet:

Microsoft Assessment and Planning (MAP) Toolkit


http://aka.ms/C4z5xa

Meeting Availability and Disaster Recovery Requirements


A range of technologies can be used to meet the
high availability and disaster recovery
requirements of a business. Standards should be
defined when making the choice on the
technologies that are available.

Windows Server Failover Clustering

Windows Server Failover Clustering (WSFC) is the


high availability and disaster recovery option that
provides protection for the SQL Server instance.
This technology protects from server level failures
but is only as good as its weakest link. For
example, if you have provisioned the hardware to
use RAID 10, with dual network cards, but it only has a single power supply, the power supply carries a
high risk of server failure because there is no redundancy built into this particular hardware component.
The hardware and software standards that were presented in the previous topic become very important
when installing and configuring a WSFC. The support from Microsoft Product Support Service is only
available if:

 The hardware and software installed on the cluster is certified for the version of Windows being
installed.

 A fully configured failover cluster passes all required validation tests.

This policy ensures that the cluster is configured to a desired standard before product support is initiated.
It also means that the WSFC is at its most optimized for providing the high availability and disaster
recovery that it is designed to do. You should, therefore, consider the following best practices when
setting up any server—referred to as a node—which is part of a WSFC:
 Use the same certified hardware on all servers.

 Ensure that hardware redundancy is implemented.

 Ensure there is identical software configuration of all nodes in a cluster.



 For the best performance, ensure all nodes are on the same network links.

 Do not install antivirus software.

 For SSIS, or if you run distributed transactions, install the Microsoft Distributed Transaction
Coordinator.

 Run the cluster validation wizard, as it will check many of the best practices required for a cluster.

Always On Availability Groups

Always On Availability Groups provides protection and recovery at a database level. There is a single
primary replica and up to eight secondary replicas. Data can be moved either synchronously or
asynchronously to a secondary replica—which can also be read from—or have administrative routines,
such as backup, performed on the secondary replica. A group of databases can be placed into an
availability group and can fail over as a single unit automatically, manually, or by forcing a failover. An
availability group should reside on a WSFC to provide server protection as well. When using Always On
Availability Groups, use the following best practices:

 Availability groups are only supported on SQL Server Enterprise Edition.


 Ensure that a WSFC is set up under best practice.

 For best performance, use a dedicated network card for availability group traffic.

 All servers that are in an availability group must be of comparable specifications.


 Ensure that a secondary replica has the disk space to support databases that fail over to it from a
primary replica.

Log Shipping
Log shipping provides a low-cost solution to high availability that can support multiple partners in this
availability solution. The hardware requirements are not as strict as those for a WSFC—the failover also
has to be handled manually. In situations where a degree of downtime is deemed acceptable, log
shipping is a useful high availability solution. Use the following best practices when configuring log
shipping:

 Ensure the database recovery model is set to full.


 Ensure transaction log backups are taken regularly to reduce the backup size.

 Consider placing log shipping on the same network to mitigate against Internet failures.

Peer-to-Peer Replication

Peer-to-peer replication provides a high availability solution by maintaining copies of data across multiple
server instances. Built on transactional replication, peer-to-peer replication provides transactionally
consistent changes, in near real time, across multiple nodes. As data is maintained across the server in
near real time, peer-to-peer replication provides data redundancy, which increases the availability of data.
Like log shipping, the hardware requirements are relaxed, and the same considerations should be taken
into account.
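When Always On Availability Groups is the chosen technology, the BI operations team can confirm replica
health from Transact-SQL. The following is a minimal sketch, assuming at least one availability group has
been configured on the instance; it joins the availability group catalog views to the replica state DMV to
show the role and synchronization health of each replica.

Checking availability group replica health

SELECT ag.name AS availability_group,
       ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states AS rs
JOIN sys.availability_replicas AS ar ON rs.replica_id = ar.replica_id
JOIN sys.availability_groups AS ag ON rs.group_id = ag.group_id;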

Question: Does your organization apply any standard to your BI environments?



Lesson 2
Configuration Considerations for BI Technologies
With the range of technologies available with which to implement a BI solution, there are many ways that
each technology can be configured. Each technology has certain configurations that should be
implemented to either improve performance, maintainability, disaster recovery, or availability, as defined
by the business requirements. Standards should be defined with production servers in mind, and then
adopted across other environments. If there are existing servers, changes should be applied and tested to
bring these servers to a level that meets the BI operations standards.

Lesson Objectives
At the end of this lesson, you will be able to define key configuration standards for the following:

 Operating system

 Database Engine
 SQL Server Integration Services

 SQL Server Analysis Services

 SQL Server Reporting Services

 Data Quality Services and Master Data Services

Operating System Considerations


In many SQL Server solutions, the operating
system configuration is often overlooked in favor
of SQL Server configuration. A number of settings
can have a direct impact on the memory, CPU,
and hard disk utilization—this can improve the
performance of a computer that is running SQL
Server on the operating system. Some of the
changes are small, but others will have a big
impact. Configuring these settings collectively
will contribute to a better performing server.
These changes are very simple to implement and
should be part of the BI operational standards. The
settings to include are:

Lock Pages in Memory

Lock Pages in Memory is a security policy that allows a specified account to keep data in physical memory,
preventing it from being paged out to virtual memory when the Windows server experiences memory
pressure. You can use this in conjunction with the SQL Server instance option Max Server Memory to
define how much memory is locked. This can improve the consistency of SQL Server memory usage and
enhance performance. The SQL Server service account must be added to the Lock Pages in Memory policy
using the following steps:

1. On the Windows desktop, at the lower-left, click the Start icon and type gpedit.msc.
2. On the Local Group Policy Editor console, expand Computer Configuration, and then expand
Windows Settings.

3. Expand Security Settings, and then expand Local Policies.

4. Select the User Rights Assignment folder.

5. In the details pane, double-click Lock pages in memory.

6. In the Lock pages in memory Properties dialog box, click Add User or Group.
7. In the Select Users or Groups dialog box, add the SQL Server service account with privileges to run
sqlservr.exe, and then click OK.

8. Log out and then log back in for this change to take effect.

Perform Volume Maintenance

For certain SQL Server operations, disk performance can be improved through a process named instant
file initialization. When a database operation, such as creating a new database, is performed in SQL Server,
a process of zero initialization takes place. For example, if a new database is created that is 200 GB in size,
Windows will place a zero value in every byte of space within the 200 GB. The process of adding zeros can
be avoided if the SQL Server service account is added to the local security policy of Perform Volume
Maintenance Task, using the following steps:

1. On the Windows desktop, at the lower-left, click the Start icon and type gpedit.msc.

2. On the Local Group Policy Editor console, expand Computer Configuration, and then expand
Windows Settings.
3. Expand Security Settings, and then expand Local Policies.

4. Select the User Rights Assignment folder.

5. In the details pane, double-click Perform volume maintenance tasks.

6. In the Perform volume maintenance tasks Properties dialog box, click Add User or Group.

7. In the Select Users or Groups dialog box, add the SQL Server service account with privileges to run
sqlservr.exe, and then click OK.

8. Log out and then log back in for this change to take effect.
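After these policies have been applied, and the SQL Server service has been restarted, you can confirm
from Transact-SQL that they are in effect. The following is a minimal sketch that assumes SQL Server 2016
SP1 or later, where these columns are exposed; the memory model shows LOCK_PAGES when Lock Pages
in Memory is being used.

Verifying Lock Pages in Memory and instant file initialization

-- Memory model: CONVENTIONAL, LOCK_PAGES, or LARGE_PAGES
SELECT sql_memory_model_desc
FROM sys.dm_os_sys_info;

-- Instant file initialization status for each SQL Server service
SELECT servicename, instant_file_initialization_enabled
FROM sys.dm_server_services;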

Setting Performance Options

You can adjust the performance options so that Windows can be configured to prioritize background
services over applications. You can also adjust the visual effects for best performance in the same area to
free up memory to the operating system. To modify these settings, you should perform the following
steps:

1. On the Windows desktop, at the lower-left, click the Start icon.

2. Type system and then press Enter.

3. In the All Control Panel Items window, click System.


4. In the System window, under Control Panel Home, click Advanced system settings.

5. In the System Properties dialog box, click the Advanced tab.

6. Under Performance, click the Settings button.


7. In the Performance Options dialog box, on the Visual Effects tab, click Adjust for best
performance.

8. On the Advanced tab, under Processor scheduling, click Background services, and then click OK.

Manage Windows Power Options

Power Options manages how your server uses power. The default option is Balanced, but you can improve
SQL Server performance by changing the power plan to High Performance, which reduces the CPU cycles
spent on power management.

1. Click the Start button and type Control Panel, and then press Enter on your keyboard.

2. Click Hardware.

3. Click Power Options.

4. On the left side of the screen, select Create a power plan.

5. Select High performance and type a name for the power plan. Click Next.

6. Select how long the system's display should stay on before the computer goes to sleep. Click the
Create button to create the custom plan.

Database Engine
To affect stability and performance, a range of
configurations can be made to the database
engine. These can be considered on a case-by-
case basis. The operations team should use a
baseline standard for SQL Server instances and
adjust that accordingly with the supporting
documentation. You should also check that
existing servers in your SQL Server estate also
meet these standards. The baseline standards
should include:
Separating database and log files

As covered in the hardware standards, the database and log files should be placed on separate drives.
Data access to the database files is random, and access to the log files is serial; therefore, combining these
files on the same drive can increase contention. Because data is written to the transaction log first,
consider using a faster disk for the log file. If availability or redundancy is an important consideration, use
a hardware-based RAID array to provide protection.

It is common to see database and log files on the same drive; you can move a file by performing the
following steps:

Moving a data file for the AdventureWorks database


-- Take the database offline
ALTER DATABASE [AdventureWorks]
SET OFFLINE;

--Modify the file using the logical filename and define a new location
ALTER DATABASE [AdventureWorks]
MODIFY FILE ( NAME = AdventureWorks_Data, FILENAME = 'D:\Data\AdventureWorks_Data.mdf' );

--Move the database data file to the new location in Windows Explorer

-- Place the database online
ALTER DATABASE [AdventureWorks] SET ONLINE;

Manage the tempdb database

Like any database, the tempdb database and log files should be placed on separate disks. In addition, you
should place the tempdb on the fastest disk possible. Many Transact-SQL and SQL Server activities can
cause intense usage of the tempdb database, including:

 Temporary table creation in queries.

 Sort operation in queries.


 Online index rebuilds.

In addition, contention of the tempdb can occur on the Page Free Space (PFS) page in the tempdb.
When temporary objects are created, one of the jobs of the PFS page is to identify pages that are free to
use. After an extent is allocated to a temporary object, a Global Allocation Map (GAM) page records what
extents have been allocated to data. Typically, there is one GAM page to track approximately 64,000
extents, or around 4 GB of data. There is also a Shared Global Allocation Map (SGAM) that denotes if an
extent is a uniform or a mixed extent—it also tracks approximately 64,000 extents, or around 4 GB of data.

To reduce the contention on the tempdb database, if your server contains eight or fewer CPU cores, you can create the same number of tempdb data files as you have cores. If you have more CPUs, you can use eight data files as a guide to the maximum number of equally sized data files you should create. This creates at least one PFS page per data file, spreading the allocation contention and reducing pressure on the tempdb database. These data files can reside on the same drive, because the intention is to create more PFS pages. Creating more than eight tempdb data files is likely to be a false economy; however, if you wish to use more data files, you can test the impact.

To create multiple tempdb data files, execute the following code:

Adding multiple tempdb data files


ALTER DATABASE [tempdb]
ADD FILE
(
NAME = [tempdata1],
FILENAME = 'E:\DATA\tempdata1.ndf',
SIZE = 1024000 KB,
MAXSIZE = UNLIMITED,
FILEGROWTH = 100000 KB
) TO FILEGROUP [Primary];

ALTER DATABASE [tempdb]
ADD FILE
(
NAME = [tempdata2],
FILENAME = 'E:\DATA\tempdata2.ndf',
SIZE = 1024000 KB,
MAXSIZE = UNLIMITED,
FILEGROWTH = 100000 KB
) TO FILEGROUP [Primary];

Setting minimum and maximum memory settings

Memory settings can be used to determine the maximum amount of memory that SQL Server can use.
This can help balance memory management on servers that run multiple services, and also configure SQL
Server to leave memory available to the operating system. Used with the Lock Page in Memory security
policy, this can also fix the memory that SQL Server can use, including the minimum amount of memory.

To configure the memory setting within SQL Server, execute the following Transact-SQL:

Changing the memory setting in a SQL Server instance


-- turn on SQL Server advanced options
EXEC sys.sp_configure 'show advanced options', '1'
RECONFIGURE WITH OVERRIDE
GO

-- Set the minimum memory to 2GB, and maximum memory to 4GB
EXEC sys.sp_configure 'min server memory (MB)', '2048'
GO
EXEC sys.sp_configure 'max server memory (MB)', '4096'
GO
RECONFIGURE WITH OVERRIDE
GO

-- turn off SQL Server advanced options
EXEC sys.sp_configure 'show advanced options', '0'
RECONFIGURE WITH OVERRIDE
GO

Manage CPU settings


The CPUs that are used by SQL Server can also be configured to meet requirements, leaving CPUs
available to other processes running on the server. In addition, you can configure the property Max
Degree of Parallelism (MAXDOP). This determines the number of CPUs that will be used to run a Transact-
SQL statement in parallel. The default setting is 0—this means that all CPUs that are allocated to the SQL
Server can be used. You can change the value up to 32,767, and this setting can be overridden, using the
query hint MAXDOP = n.

To configure the CPUs on a SQL Server instance, execute the following code:

Configuring CPUs on a SQL Server instance


--Configure the server to use CPU 0 and CPU 1
ALTER SERVER CONFIGURATION
SET PROCESS AFFINITY CPU = 0 TO 1
GO

-- turn on SQL Server advanced options
EXEC sys.sp_configure 'show advanced options', '1'
RECONFIGURE WITH OVERRIDE
GO

-- Set maximum degree of parallelism to 2
EXEC sys.sp_configure 'max degree of parallelism', '2'
GO
RECONFIGURE WITH OVERRIDE
GO

-- turn off SQL Server advanced options
EXEC sys.sp_configure 'show advanced options', '0'
RECONFIGURE WITH OVERRIDE
GO

Optimize for ad hoc workloads


For SQL Server systems that have ad hoc SQL Server statements executed against them on a regular basis,
you can improve the memory management by using optimize for ad hoc workloads. When a query is
executed against a server, it will go through a five-stage process as follows:

1. Parse – checks that the syntax of a query is correct.


2. Resolve – the object in a query is resolved to an object ID.

3. Optimize – a good execution plan for retrieving the data is generated.

4. Compile – the plan is compiled into executable code.

5. Execute – the code is executed.

Typically, in the optimize phase, the full execution plan is stored in memory for future reuse. When optimize for ad hoc workloads is enabled, a smaller stub plan is stored in memory on the first execution—it becomes a full plan on a subsequent execution. This ensures that queries that are only ever executed once do not persist a full plan in the plan cache held in memory, thereby reducing the memory used.

To enable optimize for ad hoc workloads, run the following code:

Enabling optimize for ad hoc workloads


-- turn on SQL Server advanced options
EXEC sys.sp_configure N'show advanced options', N'1'
RECONFIGURE WITH OVERRIDE
GO

--Enable optimize for ad hoc workloads
EXEC sys.sp_configure N'optimize for ad hoc workloads', N'1'
GO
RECONFIGURE WITH OVERRIDE
GO
-- turn off SQL Server advanced options
EXEC sys.sp_configure N'show advanced options', N'0'
RECONFIGURE WITH OVERRIDE
GO

Define standards for database recovery model

Taking backups is an important but routine task that provides protection and disaster recovery for data if the database becomes unavailable. However, before backups are taken, it is important to decide which recovery model to use. There are three options:

 Full

 Bulk_logged

 Simple

If you require full protection, and the best option for recovery in a disaster, you should use the full recovery model. The full recovery model logs all transactions to the transaction log of the database. You should select this model if data recovery is an important business requirement. When this model is selected and combined with a regular database and transaction log backup strategy, you can recover data to a specific point in time.

The bulk_logged recovery model fully logs individual transactions, such as insert, update, and delete statements, to the transaction log. Bulk operations, such as SELECT INTO, bcp, OPENROWSET (BULK), WRITETEXT, UPDATETEXT, and BULK INSERT, are minimally logged.

The simple recovery model provides the most straightforward form of backup and restore because it does
not support transaction log backups. Therefore, data loss can occur, because you can only restore a
database from the most recent backup.

To modify the recovery model of a database, execute the following code:

Changing a database recovery model to full.


USE [master]
GO
ALTER DATABASE [AdventureWorksDW]
SET RECOVERY FULL
GO

SQL Server Integration Services


From an operational perspective, SQL Server
Integration Services (SSIS) can be the most difficult
technology to manage. The configuration of the
technology itself is minimal, and many of the SSIS
packages that are developed can be highly
customized. Usually, supporting an SSIS solution involves fixing packages that fail during execution. Therefore, SSIS logging is very helpful in providing information to the support team when they are trying to fix a failed package—this is covered in a later module. You
can provide developers with guidelines that can
help them make use of hardware on the server, particularly regarding memory, where SSIS can have a big
impact. Consider promoting the following guidelines for the creation of SSIS packages:

Adjusting buffer size

SSIS processes data in areas of memory known as buffers. Data flows through the pipeline in buffers as efficiently as possible. Developers can influence the space used in memory by adjusting a number of SSIS properties that control the amount of data that is placed in a buffer.

MaxBufferSize is a nonconfigurable property in an SSIS package that has a value of 100 MB. This means
that data in a buffer cannot exceed 100 MB per dataset—otherwise SSIS will create another buffer of 100
MB and split the data. This can have an impact on memory usage. For example, if 150 MB of data is
loaded in a dataset, SSIS will reduce the dataset and spread the data across multiple buffers. However, this
may mean that only approximately 75 MB of data is stored in each buffer—this will waste 25 MB per
buffer page. There is also an Estimated Row Size property that contains metadata about the
approximate size of data. As a result, you can adjust the following properties to influence the buffer size:

 DefaultBufferSize. Set to 10 MB by default, the DefaultBufferSize is limited by the MaxBufferSize value of 100 MB.

 DefaultBufferMaxRows. By default, this is set to 10,000 rows, and is used with the Estimated Row Size property to calculate the size of a dataset.

If you are aware of a dataset row size, in addition to the number of rows, these values can be adjusted to
maximize the memory usage of an SSIS data flow. This can be difficult to calculate with some tables in a
data warehouse, but for tables such as dimension tables that are relatively static, this can improve the
performance of the data load.

Promote parallelism within SSIS packages

Parallelism can occur in two ways within an SSIS package. An SSIS package execution can be designed in such a way that it executes multiple control flow tasks at the same time. You can also use the MaxConcurrentExecutables package property to control this. The MaxConcurrentExecutables property determines the number of executables that can run concurrently. The default setting of -1 provides a number of threads that is equal to the number of CPU cores on a server, plus two. For example, on a server with four processors, this would equate to six concurrent executables. This setting can be increased, and will benefit CPU utilization when SSIS runs on a dedicated server.

Avoid the use of blocking transformations

Advise SSIS developers not to use blocking transformations in an SSIS data flow, but instead look for alternatives. Blocking transformations include:

 Aggregate

 Fuzzy Grouping
 Fuzzy Lookup

 Sort

These types of transformations create additional buffer space, introduce new threads when the
transformations are being executed, and can affect the performance of the SSIS server. For example,
instead of using a Sort transform to organize data, it is better to use an ORDER BY clause in a Transact-
SQL statement when retrieving data from a source system.
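
For example, the following is a minimal sketch of the ORDER BY alternative. The table and column names are taken from the AdventureWorksDW sample database and are illustrative only. If downstream components rely on the sort order, remember to mark the source output as sorted by setting the IsSorted and SortKeyPosition properties in the source component's advanced editor.

Sorting data at the source instead of using a Sort transformation

-- Return the rows already ordered by the source query
SELECT DateKey, FullDateAlternateKey, CalendarYear
FROM dbo.DimDate
ORDER BY FullDateAlternateKey;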

Create a package template

You can use package templates to include control flows, data flows, and associated properties that are
part of a standardized package for future reuse. This can help in maintaining agreed standards. To create
a package template, you create an SSIS package within Visual Studio. Add the common objects to the
package, then save and close Visual Studio. You should then copy the SSIS dtsx file created from the file
system to the following location:

C:\Program Files (x86)\Microsoft Visual Studio 12.0\Common7\IDE\PrivateAssemblies\ProjectItems\DataTransformationProject\DataTransformationItems
To reuse the package template, perform the following steps:

1. Open Visual Studio

2. Create a new SQL Server Integration Services project.

3. In Solution Explorer, right-click the SQL Server Integration Services project, point to Add and click
New Item.

4. In the Add New Item dialog box, double-click the package template created.

SQL Server Analysis Services


A range of platform optimizations can be
performed with Analysis Services in both
Multidimensional and Data Mining mode, in
addition to Tabular mode—this can ensure that
Analysis Services is operationally ready.
Recommendations to further improve the
performance of the cubes that are created can
also be made to developers.

Add an Analysis Services administrator at setup

When installing Analysis Services, by default, no administrator is defined. To ensure access to the Analysis Services instance, add a user or group who has the rights to manage it.
Set the data directory location

You can define the location of the Analysis Services data in the instance properties of Analysis Services.
This provides the opportunity to control the storage location of the data that meets the needs of the
environment in which Analysis Services is installed. To change the data directory location, perform the
following steps:

1. Open SQL Server Management Studio.

2. In the Connect to Server dialog box, ensure the Server Type states Analysis Services, and specify a
server and credential to connect to the instance, and then click Connect.

3. In Object Explorer, right-click the instance name, and click Properties.

4. In the Analysis Services Properties dialog box, under Select a page, click General.

5. In the row that states DataDir, go to the second column and click the ellipsis (…) button.

6. In the Browse for a Remote Folder dialog box, browse to a folder location and click OK.

7. Click OK in the Analysis Services Properties dialog box.



Tune Analysis Services memory usage

In the Analysis Services instance properties, on the General page, a number of options enable you to
tune how Analysis Services manages the memory on the server. Set the memory options to account for
the operating system and other applications—the memory settings define how SSAS will manage
connections and memory when thresholds are reached or exceeded. The settings are expressed as a
percentage when a value of 0 to 100 is specified; a value above 100 changes the measurement to bytes,
and includes:

 TotalMemoryLimit. When the defined threshold is reached—the default is 80 percent—Analysis Services will start to deallocate memory to ensure there is enough space. The TotalMemoryLimit must always be less than HardMemoryLimit.

 HardMemoryLimit. The default value is 0—this means that the HardMemoryLimit is set to a value midway between TotalMemoryLimit and the total physical memory of the system. When this threshold is reached or exceeded, Analysis Services will terminate sessions, returning an error.

 LowMemoryLimit. The default value is 65, which is 65 percent of physical memory, where SSAS
clears memory out of caches by closing expired sessions and unloading unused calculations.

 VertiPaqPagingPolicy. This applies to tabular server mode only, and specifies whether Tabular SSAS
should page memory to disk when there is memory pressure. A value of 0 prevents paging to disk; 1
is the default and enables paging.
 VertiPaqMemoryLimit. This applies to tabular server mode only. If paging to disk is allowed, this
property specifies the level of memory consumption (as a percentage of total memory) at which
paging starts. The default is 60.

 OLAP\AggregationMemoryLimitMax. This defines the maximum amount of memory that can be devoted to aggregation processing. The default value is 80.
 OLAP\AggregationMemoryLimitMin. This defines the minimum amount of memory that can be
devoted to aggregation processing. The default value is 10.

 OLAP\BufferMemoryLimit. The default value is 60, which denotes the amount of memory that can
be used for processing cubes.
Use partitioning in Enterprise Edition

You should recommend that developers use cube partitioning in the Enterprise Edition of SQL Server.
Partitions can be used as storage containers for data and aggregations of a cube. Every cube contains one
or more partitions. For a cube with multiple partitions, each partition can be stored separately in a
different physical location. Each partition can be based on a different data source. Partitions are not visible
to users; the cube appears to be a single object.

SQL Server Reporting Services


When SQL Server Reporting Services is
experiencing heavy usage, you should use
platform management to ensure that the
hardware is utilized in an efficient way. In addition,
the BI operations team might be asked to
optimize the users’ browsing experience by using
Report Manager settings. You should implement
the following standards:

Manage the Reporting Services databases

Reporting Services uses two databases to manage the system and store reports. The ReportServer database is to Reporting Services what the Master database is to SQL Server. The ReportServer database contains the metadata that runs, manages, and
secures the Reporting Services system—in addition, it can also hold report history. The
ReportServerTempDB is a volatile database that is used to hold cached datasets that are configured for
reports. Standard database practice should be followed when using these databases. In addition, on
heavily utilized servers, consider placing the ReportServerTempDB on a separate, faster hard disk.
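
For example, the ReportServerTempDB data file can be relocated by using the same pattern shown earlier for the AdventureWorks database. The following sketch assumes the default database and logical file names, so confirm the logical names in sys.master_files first, and stop the Report Server service before taking the database offline. The target path is illustrative.

Moving the ReportServerTempDB data file

-- Confirm the logical and physical file names
SELECT name, physical_name
FROM sys.master_files
WHERE database_id = DB_ID('ReportServerTempDB');

-- Take the database offline and point the data file at a faster drive
ALTER DATABASE [ReportServerTempDB] SET OFFLINE;

ALTER DATABASE [ReportServerTempDB]
MODIFY FILE ( NAME = ReportServerTempDB, FILENAME = 'E:\Data\ReportServerTempDB.mdf' );

-- Move the file in Windows Explorer, and then bring the database online
ALTER DATABASE [ReportServerTempDB] SET ONLINE;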

Adjust memory usage

Memory settings can be specified to set a lower and upper limit on the amount of memory that is used by
Reporting Services, giving greater control over server resources. This includes:

 WorkingSetMaximum. This specifies the maximum amount of memory in kilobytes that is to be used by the reporting server.

 WorkingSetMinimum. This specifies the minimum amount of memory in kilobytes that is to be used
by the reporting server.
These settings must be manually added to the rsreportserver.config file. In addition, thresholds can be set
that cause the report server to change how it prioritizes and processes requests—depending on whether it
is under low, medium, or heavy memory pressure. Configuration settings that control memory allocation
for the report server include:

 MemorySafetyMargin. This is a percentage of the WorkingSetMaximum value that determines the upper boundary of the low memory pressure band.

 MemoryThreshold. This is a percentage of the WorkingSetMaximum value that determines the upper boundary of the medium memory pressure band.
The high memory pressure band is between the MemoryThreshold value and the WorkingSetMaximum value.

Configuration involves setting the boundaries for each of the memory pressure bands in the RSReportServer.config file.
Use caching and snapshot reports

To optimize the user’s experience of browsing reports, the BI operations team can set the properties of a
report in the web portal to use caching or snapshot reports. Cached reports store a copy of the data
retrieved from a data source in the ReportServerTempDB database. Snapshots are stored in the
ReportServer database, and can also be used to retain a historical snapshot of the data.
The BI operations team should define the standards for which reports would make use of a cache—and
which reports would make use of snapshots. The factors that would influence the standards include:

 The recurring nature of the report.

 Whether or not a report’s history should be retained.

 The impact on parameter usage in a report.

Scheduling report execution

Report execution can be scheduled in the following ways:

 Report snapshot generation.

 Report cache refresh schedule.

 Report subscriptions.

Reporting Services uses the SQL Server Agent to manage the schedule for the execution of the reports,
and the BI operations team will define a timetable for when report execution schedules should take place.
You could spread out the times at which the scheduled executions take place. It is common for many report executions to occur at the same time overnight, causing the server to be under
unnecessary pressure. Organizing the schedule to spread the load of executions will benefit the server.
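
One way to see how the existing schedules are distributed is to query the SQL Server Agent catalog views in the msdb database; Reporting Services schedules appear as Agent jobs named with schedule GUIDs. The following query is a general sketch that lists all scheduled jobs, not only those created by Reporting Services.

Reviewing SQL Server Agent job schedules

-- List each job with its schedule start time and next run time
SELECT j.name AS job_name,
       s.name AS schedule_name,
       s.active_start_time,
       js.next_run_date,
       js.next_run_time
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobschedules AS js ON j.job_id = js.job_id
JOIN msdb.dbo.sysschedules AS s ON js.schedule_id = s.schedule_id
ORDER BY s.active_start_time;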

Data Quality Services and Master Data Services


Master Data Services and Data Quality Services enable enterprise information management within an organization, and can be integrated within SQL Server Integration Services control flows and data flows to process and cleanse data, and to retrieve master data like any other source data. The following practices should be employed and considered as adopted standards:

Use standard database best practice

Master Data Services uses a database to store the metadata to run the service, in addition to storing the master data itself. The name of the database can be configured, and the default name is MDS. Data Quality Services creates three databases during installation, as follows:

 DQS_MAIN

 DQS_PROJECTS

 DQS_STAGING_DATA
All these databases should be implemented following database best practice, such as separating data and
log files onto separate disks and performing regular backups.
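
For example, regular full backups of these databases can be taken with straightforward BACKUP DATABASE statements. The database names below assume the defaults described above, and the backup path is illustrative.

Backing up the default MDS and DQS databases

BACKUP DATABASE [MDS]
TO DISK = 'G:\Backups\MDS.bak' WITH INIT;

BACKUP DATABASE [DQS_MAIN]
TO DISK = 'G:\Backups\DQS_MAIN.bak' WITH INIT;

BACKUP DATABASE [DQS_PROJECTS]
TO DISK = 'G:\Backups\DQS_PROJECTS.bak' WITH INIT;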

Integration requires the same server

Data Quality Services can integrate well with Master Data Services but, for this to occur, both services must be running on the same instance of SQL Server, and on the same server. To configure integration, you should perform the following steps:
1. Open Master Data Services Configuration Manager.

2. Configure the Master Data Services options first.



3. On the Web Integration page, click Enable Integration with Data Quality Services.

Question: Which area of the BI technology stack will you explore in more depth when you
get back to your organization?

Lesson 3
BI Architectures
SQL Server BI solutions are supported on a wide range of platform architectures that are designed to
meet the business requirements. The BI operations team is highly likely to be supporting a wide range of
platform architectures across the given environments. By using the configuration standards outlined in the
previous topic, different architectures can be implemented to meet the availability, disaster recovery,
maintenance, and performance needs of a solution.

Lesson Objectives
At the end of this lesson, you will be able to define common architectures and the appropriate configuration standards for:

 Stand-alone architectures

 Isolated services architectures


 High availability architectures

 Disaster recovery architectures

Stand-alone Architectures
Stand-alone architectures refer to SQL Server installations that are performed on a single server, with multiple services running on the same physical or virtual server. Scenarios in which stand-alone architectures are found include:

 Nonproduction environments such as development and testing environments.

 Production servers where budgets are a constraint.

 Production servers where the services are not seen as mission critical.

 Production servers that have no high availability requirements defined.

In stand-alone environments, it is likely that compromise will be required in the configuration of the various services. However, certain configuration standards should not be compromised, because they could prevent access to a service, negatively affect the performance of the server, or compromise the recoverability of the service. These standards can include:

 Implementing operating system standards. The operating system setting can aid the performance
of the server. One setting that may be compromised is Lock pages in memory—this should be
considered on a case-by-case basis.

 Balancing memory and CPU utilization across services. It is important to ensure that all services
running on the single server have enough memory and CPU allocated to them to operate efficiently.

 Defining database recovery models. Database recovery models affect the level of backup and
restore capabilities. For any database that needs to be restored to a point in time, you should ensure
that the recovery model is set to full.

 Adding an Analysis Services administrator account. An Analysis Services instance that does not
have an administrator defined will be inaccessible—in this case, access must be granted by the built-
in administrators.

The area of compromise is the management of the databases and database files. Typically, stand-alone
architectures may not be endowed with many hard disks. As a result, databases and their associated files
might have to be collocated on the same volumes. In this scenario, the following databases and database
files should be managed in the following priority:

1. tempdb data files should be placed on a separate volume.

2. Database data and log files should be placed in separate volumes.

3. System databases should be placed on a separate volume.

4. SQL Server installation files should be placed on a separate drive.

Isolated Services Architecture


Isolated services architecture refers to SQL Server
installations that only contain a dedicated service
installed on a physical or virtual server. Scenarios
in which an isolated services architecture is found
include:

 Nonproduction servers, such as a test or UAT server.

 Production servers where isolation is required with high availability needs.
In this scenario, all configuration standards should
be applied and checked by the BI operations
team. The standards will vary, based on the service that is installed—they should meet the agreed
business requirements. For example, if a server contains a single SQL Server instance, the following
standards should be applied:

1. Operating system standards:

o Lock pages in memory.


o Perform volume maintenance.

o Prioritize for background services.

o Manage power options.

2. Database engine standards:

o Separate database and log files.

o Manage the tempdb database.

o Set minimum and maximum memory settings.

o Manage CPU settings.

o Optimize for ad hoc workloads.


o Define standards for database recovery model.

3. Any additional agreed standards.

The third area covers any additional settings that are agreed, such as antivirus software configuration and
third-party application configuration requirements.

High Availability Architectures


High availability architectures refer to SQL Server installations that have added hardware and services for redundancy in a given solution. Scenarios in which high availability architectures are found include:

 Production servers that are seen as mission critical.

 Nonproduction environments, such as a preproduction environment.
It is rare to see this type of architecture in a
nonproduction environment, although some
organizations must replicate the same setup as a production server for final sign-off, so that deployed
changes can meet compliance or legislative requirements.

In the scenario on the slide, a Reporting Services instance is set up in native mode in a high availability
architecture. The following standards should be adopted:

On all servers:

1. Operating System standards:

o Lock pages in memory.


o Perform volume maintenance.

o Prioritize for background services.

o Manage power options.


On the Windows Server Failover Cluster servers:

1. Servers involved in the WSFC must pass the cluster validation test.

2. Database engine standards:

o Separate database and log files.

o Manage the tempdb database.

o Set minimum and maximum memory settings.

o Manage CPU settings.

o Optimize for ad hoc workloads.

o Define standards for database recovery model.

On servers that host the Reporting Services applications:

 Memory usage should be adjusted for Reporting Services.



Network Load Balancer (NLB) requirements

1. The Reporting Services host name and IP address of the NLB should be registered in DNS.

2. The IP addresses of SSRS1 and SSRS2 should be enlisted in the NLB to accept round robin requests for access to the report server.

For more information about high availability for Reporting Services, see:
http://aka.ms/Psec4f

Incorporating Disaster Recovery


Backup should always be used as a method for
recovering from a disaster and should form part of
the standard operating procedures for a BI
operations team. In addition, the high availability
Reporting Services architecture can be extended
to use availability groups and create secondary
replicas on an on-premises server—or, as the slide
shows, on an Azure virtual machine in the cloud.
This approach can be used together with the
backup strategy as a method to recover from an
SSRS database failure. The connection string
properties can be configured to connect to the
primary WSFC, with a failover partner set to the Azure virtual machine if the primary server does not
respond.

As a result, additional standards should be applied to the Azure virtual machine, including:

1. Operating system standards:

o Lock pages in memory.


o Perform volume maintenance.

o Prioritize for background services.

o Manage power options.


2. Database engine standards:

o Separate database and log files.

o Manage the tempdb database.

o Set minimum and maximum memory settings.

o Manage CPU settings.

o Optimize for ad hoc workloads.

o Define standards for database recovery model.

In addition, Azure Infrastructure as a Service (IaaS) components will have to be configured to allow
network communication from the on-premises servers to the server in the cloud, including:

1. A dedicated connection such as ExpressRoute, or a VPN tunnel for traffic over the Internet.

2. The creation of a virtual network that will host the virtual machine.

3. Enabling a firewall setting on the Azure virtual machine to allow remote Reporting Services requests.

If a failover occurs, you should complete the following steps before resuming operations:

1. Stop the instance of the SQL Agent service that was being used by the primary database engine
hosting the Reporting Services databases.

2. Start SQL Agent service on the computer that is the new primary replica.

3. Stop the Report Server service.

4. If the report server is in native mode, stop the Report Server Windows service by using Reporting Services Configuration Manager.
5. If the report server is configured for SharePoint mode, stop the Reporting Services shared service in
SharePoint Central Administration.

6. Start the Report Server service or Reporting Services SharePoint service.


7. Verify that reports can run against the new primary replica.

For more information about Reporting Services and availability groups, see:
http://aka.ms/Eja06d

Question: Whilst isolated services would be a desired architecture for many BI production
scenarios, how do you manage balancing services on a stand-alone architecture setup? Do
you see any options that have just been presented that could help your current situation?

Lesson 4
SharePoint BI Environments
Some organizations choose to surface the presentation layer of their BI solution within a SharePoint
solution. SharePoint enables the sharing and collaboration of BI assets across the business in a controlled
and secure manner. It can integrate with SQL Server technologies such as Reporting Services, and other
presentation technologies, including PowerPivot and PerformancePoint. When supporting a SharePoint BI
solution, standards are equally important to enable a BI operations team to support the solution in an
effective manner.

Lesson Objectives
At the end of this lesson, you will understand the SharePoint considerations for:

 Hardware

 Single farm environments


 High availability environments

 Key configuration considerations

Hardware Considerations
SharePoint Server is a multitier architecture that enables the sharing of, and collaboration on, business documents. There are three layers to the SharePoint Server architecture:

Web service tier


This layer includes one or more servers that accept
web requests for SharePoint applications.

Application layer
This layer includes applications such as Reporting
Services, PowerPivot, and PerformancePoint.

Database layer

This layer hosts the supporting SharePoint and application databases.

The BI operations team might be required to support the solution at both the database and application
layers. Meeting the minimum standards and accounting for capacity based on the databases used will
ensure that the SharePoint server is not put under resource contention. From a database layer perspective,
standard operating system and database practice should be employed. This will include databases related
to both SharePoint and application databases, including:

 SharePoint_Config

 SharePoint_AdminContent_<GUID>

 WSS_Content

 AppManagement

 Bdc_Service_DB_<GUID>

 Search_Service_Application_DB_<GUID>

 Search_Service_Application_AnalyticsReportingStoreDB_<GUID>

 Search_Service_Application_CrawlStoreDB_<GUID>

 Secure_Store_Service_DB_<GUID>

 SharePoint_Logging

 User Profile Service Application_ProfileDB_<GUID>

 DefaultPowerPivotServiceApplicationDB_<GUID>

 PerformancePoint Service _<GUID>

 ReportingService_<GUID>

 ReportingService_<GUID>_TempDB

The minimum hardware requirements for the database server are:


Processor

4 cores, 64-bit platform for fewer than 1,000 users.

8 cores, 64-bit platform for between 1,000 and 10,000 users.


Memory

8 GB for fewer than 1,000 users.

16 GB for between 1,000 and 10,000 users.


Hard disk

80 GB minimum.

Single Farm Environments


In a single server setup, the SharePoint server
architecture tiers are hosted on a single Windows
server. This type of configuration is found in a
nonproduction environment, such as a
development environment, or in small production
environments. Because all of the three tiers run on
the same Windows server and share the same
hardware, there can be increased contention of
resources, although the main benefit of this model
is that the licensing cost for the solution is
minimized.

To support BI applications on a SharePoint server, the BI operations team might be required to support one or all of the following services:

Excel Services
Excel Services enables the sharing of Excel® files and is a prerequisite service for Power Pivot for
SharePoint. Excel Services is configured in the application layer.

PowerPivot for SharePoint

Power Pivot for SharePoint enables Power Pivot functionality with a Power Pivot instance of Analysis
Services, created in SharePoint Server by using SQL Server setup.

PerformancePoint Services

PerformancePoint Services is installed as part of SharePoint setup on the application servers, on the
SharePoint farm. SharePoint Central Administration can be used by the BI operations team to complete
the configuration of PerformancePoint and other BI services hosted within SharePoint Server.

High Availability Environments


In a high availability setup, the SharePoint
architecture tiers are separated across Windows
servers, with another server provided at each tier
to offer redundancy. Multiple servers can be
added to each tier, in addition to providing a
scale-out architecture to distribute the workloads
across multiple servers. In this environment,
additional infrastructure is set up, such as
configuring Network Load Balancer and managing the Domain Name System (DNS). The architecture still provides the BI services of Reporting Services, Power Pivot, and PerformancePoint in a highly available way. However, there are additional SharePoint services that might have to be supported by the BI operations team, including:

Claims to Windows Token Service

This service is required when, for example, Excel Services has to communicate with a remote data source
that is hosted outside of the SharePoint farm. For a BI implementation of a SharePoint farm, the Claims to
Windows Token Service must be configured on the same server on which Excel Services is installed.

Secure Store Service


The Secure Store Service provides an alternative method of authentication that is available to a wide range of applications. The Secure Store Service greatly simplifies the configuration of authentication for many services, including PerformancePoint Services and PowerPivot.

Lab: Configuring BI Components


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

Adventure Works employees are increasingly frustrated by the time it takes for the business reports to
become available on a daily basis. The existing managed BI infrastructure, including a data warehouse,
enterprise data models, and reports and dashboards, are valued sources of decision-making information.
However, users are increasingly finding that it takes too long for the data to be processed in the overnight
load, resulting in reports not arriving to business users until the afternoon.

Objectives
After completing this lab, you will be able to:

 Identify an area where the application configuration can be standardized.


 Configure BI applications.

Estimated Time: 45 minutes

Virtual machine: 10988C-MIA-SQL


User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Standardizing the Data Platform


Scenario
Adventure Works has identified that there must be more standardization of the data platform applications
as a first step to stabilizing the current solution that they have in place. You have been tasked with
identifying the configuration that should be standardized on the MIA-SQL server.

The main tasks for this exercise are as follows:


1. Prepare the Lab Environment

2. Review the MIA-SQL Server Configuration

 Task 1: Prepare the Lab Environment


1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. Run Setup.cmd in the D:\Labfiles\Lab02\Starter folder as Administrator.

 Task 2: Review the MIA-SQL Server Configuration


1. Collaborate with two or three other students.
2. Use the PlatformReview.docx in the D:\labfiles\Lab02\Starter folder as a framework to identify
the configuration changes to be made to the MIA-SQL Server.

3. Close WordPad.

Results: At the end of this exercise, you should have created a table that shows which areas of the data
platform should be standardized, including:

The operating system.

The MIA-SQL database engine instance.


The MIA-SQL Analysis Services instance.

The MIA-SQL Reporting Services instance.

Exercise 2: Configuring the Operating System


Scenario
Recommendations for configuration changes to the operating system have been approved by the change
board. As a member of the BI operations team, you are to implement the changes on the MIA-SQL
operating system and validate that each change has been successfully configured.

The main tasks for this exercise are as follows:

1. Setting the Performance Options.

2. Set Lock Pages in Memory


3. Set Perform Volume Maintenance Settings

 Task 1: Setting the Performance Options.


1. Open System properties.

2. Set the performance options for background services.

 Task 2: Set Lock Pages in Memory


1. Open Local Group Policy.

2. Set the Lock pages in memory for the SQL Server Database Engine service account.

 Task 3: Set Perform Volume Maintenance Settings


 In Local Group Policy, set the SQL Server Database Engine Service Account to Perform Volume
Maintenance.

Results: At the end of this exercise, you will have:

Set performance options.

Set lock pages in memory.

Set Perform Volume maintenance.



Exercise 3: Configuring the Database Engine


Scenario
Some of the recommended configuration changes to the database engine have been approved by the
CTO as the SQL Server service requires a restart. As a member of the BI operations team, you are to
implement the changes on the MIA-SQL database engine instance and validate that each change has
been successfully configured.

The main tasks for this exercise are as follows:

1. Modifying SQL Server Memory

2. Optimizing for Ad Hoc Workloads

3. Moving tempdb Data Files

4. Moving tempdb Log Files

 Task 1: Modifying SQL Server Memory


1. Start SQL Server Management Studio and connect to the MIA-SQL instance of the SQL Server
database engine by using Windows authentication.

2. Open a new query window.

3. Write a query that sets the instance minimum memory to 2 GB.

4. Write a query that sets the instance maximum memory to 4 GB.

5. Execute the query.

 Task 2: Optimizing for Ad Hoc Workloads


1. Open a new query window.

2. Write a query to set the instance level property to optimize for ad hoc workloads.
3. Execute the query.

 Task 3: Moving tempdb Data Files


1. View the properties of the tempdb database.

2. Move the tempdb data files to the G:\Microsoft SQL Server\MSSQLSERVER\Data folder.

 Task 4: Moving tempdb Log Files


1. Move the tempdb log files to the F:\Microsoft SQL Server\MSSQLSERVER\Logs folder.

2. Restart the MIA-SQL Server instance.

3. Validate the changes made to the tempdb.

4. Validate the changes made for Optimize for Ad hoc Workloads.

5. Validate the changes made for SQL Server memory.

6. Close SQL Server Management Studio, without saving any changes.



Results: At the end of this exercise, you will have:

Modified the SQL Server memory.

Configured the MIA-SQL database instance to be optimized for ad hoc workloads.


Moved tempdb data files to the G:\.

Moved the tempdb log file to the F:\.

Exercise 4: Configuring Reporting Services


Scenario
Some time has passed since the configuration of the operating system, the database engine, and Analysis
Services. It has been identified that Reporting Services is consuming memory to the detriment of these
services. You have been authorized by the change board to modify the Reporting Service memory so that
it uses a maximum of 3 GB and a minimum of 2 GB of memory, so it does not affect the server.

The main tasks for this exercise are as follows:

1. Modifying the Reporting Services Memory

 Task 1: Modifying the Reporting Services Memory


 Configure the memory of Reporting Services to a maximum of 3 GB and a minimum of 2 GB.

Results: At the end of this exercise, you will have:

Modified the memory setting for Reporting Services.

Question: Are there any additional changes you would have made to the server that was
configured?

Question: What configuration setting would you adopt as standard—and why?



Module Review and Takeaways


In this module, you have learned that the configuration of the BI components within the SQL Server
product stack represents an opportunity to have the greatest impact on the stability and performance of
the overall BI solution. It also enables the BI operations team to rule out the data platform as a root cause
of issues that occur in a given environment. To that end, you have learned:

 The importance of standardized builds.

 The configuration considerations for BI technologies.

 The BI architectures available.

 The considerations for SharePoint BI environments.

Real-world Issues and Scenarios


Common scenarios that are observed on BI servers in the real world include:

 The overprovisioning of multiple virtual servers on a single host server.

 Incorrectly configured memory and CPU settings for multiple services on a single server.
 Data and log files stored on the same drive.

 The incorrect management of tempdb data files.

 Local security policy settings of an operating system not being applied to servers.

Module 3
Managing Business Intelligence Security
Contents:
Module Overview 3-1 
Lesson 1: Security Approach to BI Solutions 3-2 

Lesson 2: Security Components 3-8 

Lesson 3: Security Approach for BI Components 3-14 


Lesson 4: The Security Approach in Different BI Environments 3-20 

Lab: Managing Business Intelligence Security 3-22 

Module Review and Takeaways 3-26 

Module Overview
Managing the security of data within an organization should be the first priority for any operational team.
Not only could the ramifications of a data leak lead to commercial losses, but there may also be legal and
financial penalties that could have wider implications for the business.
It is very important that the business intelligence (BI) operations team takes a holistic approach to
securing the data. Considerations should include the physical security of the data, in addition to
protection at an operating system or SQL Server® level. Transfer of data to other sites and data at rest
may have to be protected. In these cases, encryption components come into play. Meeting compliance
requirements may force a business to track activity on SQL Servers, or provide access to data by using
auditing.

Objectives
At the end of this module, you will be able to:

 Describe the security approach to a BI solution.

 Understand the security components available.

 Apply the security components to BI technologies.

 Manage security in different environments.



Lesson 1
Security Approach to BI Solutions
The approach taken by the BI operational team should fall in line with any security policies that have been
defined within the organization. This approach should apply throughout the entire technology stack, and
consideration should also be extended to include nontechnology security issues. This should lead to a
culture of security in depth, with the aim of mitigating against any potential weaknesses that could expose
unauthorized access to the data.

Lesson Objectives
After completing this lesson, you will be able to:

 Describe a security approach.

 Understand the SQL Server security model.

 Track activity with auditing.

Describe the Security Approach


SQL Server and Windows® technologies are
certified to Common Criteria Compliance,
enabling administrators to achieve industry
compliance in a range of areas, such as PCI and
HIPAA. You should not rely on technology alone
to meet compliance requirements, because
process can also play an important part in
achieving the security objectives within an
organization. When addressing security, the
following areas should be considered within your
overall approach:

Physical security

Physical access to the servers that host the data should be controlled. Many organizations that hold
sensitive data will host servers within a locked room where access is controlled. Other organizations will
store data within remote data centers that are hosted by a managed service provider, such as Microsoft®
Azure™. These data centers operate a very strict policy to control who can gain access to their locations.
For auditing purposes, key cards are typically used to log the individuals who have accessed a server room
or data center.

People

There are examples of breaches in security that have occurred because of social engineering—where
someone elicits information from another individual to gain unauthorized access, either to data or data
centers. It is important that the organization breeds a culture of defense in depth, where it is deemed
appropriate for an employee to question the actions of an individual when security is involved. Employees
should feel comfortable with this situation.

The BI operations team should evaluate security requests on a case-by-case basis. There is a difference
between what a user wants, and what a user needs. Users may request elevated privileges out of
convenience, or to bypass a process. If a request is thought to be inappropriate, the BI operations team
should reject it and give reasons for this course of action.

Update maintenance

Windows and SQL Server updates should be maintained on all servers. Service Packs and Cumulative
Updates will contain security updates that are designed to reduce security vulnerabilities. It is important to
provide a nonproduction environment that enables the testing and impact analysis of installing an update
before it is installed within a production environment. It is equally important to inform the business when
SQL Server falls out of scope for Microsoft support—a couple of years’ notice is normally given but, after
this time, there is no guarantee that further updates will be available to help protect against future
vulnerabilities.

Surface area reduction

Surface area reduction involves installing only the technology that is required for a service to be provided
to a business. This may form part of a standardized SQL Server build, a subject covered in Module 2 of this
course. Removing or disabling unused services and applications from a server reduces the opportunity for
an attacker to interrogate SQL Server or Windows components.
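
Part of this reduction can be scripted. The following sketch disables two features that are frequently targeted when left enabled; only disable them after confirming that no workload depends on them.

Disabling commonly targeted features

EXEC sys.sp_configure 'show advanced options', '1';
RECONFIGURE WITH OVERRIDE;
GO
EXEC sys.sp_configure 'xp_cmdshell', '0';
EXEC sys.sp_configure 'Ole Automation Procedures', '0';
RECONFIGURE WITH OVERRIDE;
GO
EXEC sys.sp_configure 'show advanced options', '0';
RECONFIGURE WITH OVERRIDE;
GO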

Utilizing Active Directory

When the authentication model for SQL Server is set to Windows authentication or mixed mode
authentication, you can use domain objects to control access to a SQL Server. Active Directory® is a
directory services database that holds information about objects including computers, users, and groups.
Objects can be managed and secured through a central management console. The network team will
already be managing security access to users through group management. The range of groups available
includes:
 Domain local groups. These are used to define access to resources such as a SQL Server login, and
typically contain users and other groups as members.
 Global groups. These are used to group together a collection of users and other global groups from
the same domain, to represent people who work in the same function or department. For example, a
global group named Accounts may contain users or other global groups named Accounts Payable and Accounts Receivable. All users who are members of those groups will also be members of the Accounts global group.

 Universal groups. These are used to group together a collection of users and global groups from
other domains to represent people who work in the same function or department. Universal groups
operate in a similar way to global groups, but their scope extends beyond the domain in which they
belong.

Active Directory administrators will typically assign users to a global group. This group is then placed in a
domain local group to which permissions are assigned. This approach is referred to as the A, G, DL, P
approach and may be used within the business. The BI operations team should be aware of the available
groups and use them as much as possible to ensure a compliant and coherent security approach to
managing access to a SQL Server.
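
For example, a domain local group can be given access to a database with a few statements. The group name below is illustrative, and the db_datareader role is used only as an example of a least-privilege permission set.

Granting database access to a domain local group

-- Create a login for the Windows group, and map it to a database user
CREATE LOGIN [ADVENTUREWORKS\BI_Report_Readers] FROM WINDOWS;
GO
USE [AdventureWorksDW];
GO
CREATE USER [ADVENTUREWORKS\BI_Report_Readers] FOR LOGIN [ADVENTUREWORKS\BI_Report_Readers];
ALTER ROLE db_datareader ADD MEMBER [ADVENTUREWORKS\BI_Report_Readers];
GO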

Encryption

Encryption is the process of converting data into a ciphertext format so that it is unreadable to unauthorized personnel. Data encryption can be performed by SQL Server, by the Windows operating system, or by applications, and can be used to provide an additional level of protection in the following areas (a minimal example of encrypting a database at rest follows the list):

 Data transfer over a public network such as the Internet.


 Database encryption for physical disks removed from a server.

 Column level protection to hide sensitive data, such as credit card information, from internal users.
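
The following is a minimal sketch of the second area, protecting database files and backups at rest with Transparent Data Encryption (TDE). The database name, certificate name, and password are illustrative, and the certificate and its private key should be backed up as soon as they are created.

Encrypting a database at rest with TDE

USE master;
GO
-- Create a database master key and a certificate to protect the database encryption key
CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'Pa55w.rd_TDE_example';
GO
CREATE CERTIFICATE TDECert WITH SUBJECT = 'TDE certificate for the data warehouse';
GO
USE [AdventureWorksDW];
GO
-- Create the database encryption key and turn encryption on
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_256
ENCRYPTION BY SERVER CERTIFICATE TDECert;
GO
ALTER DATABASE [AdventureWorksDW] SET ENCRYPTION ON;
GO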

The BI operations team should consider all of the preceding areas when they are looking to provide a
secure, defense-in-depth approach to the organization’s BI assets. This will involve working with other
teams, such as Active Directory and network administrators, database administrators, and an
organization’s security team, to devise a comprehensive security approach that meets the needs of the
business.

SQL Server Security Model


Each of the BI technologies has its own security
models and considerations that should be utilized
by the BI operations team to provide protection
against unauthorized access to the data.
Database Engine

The SQL Server Database Engine hosts the data warehouse and provides the capability to apply security at different levels to enable a defense-in-depth security approach. It can be protected at the following levels:
 Server

 Database

 Schema

 Object

From a database engine perspective, the common activities involve managing SQL Server logins and
database users. In addition, there is the ability to manage encryption technologies to associate with logins
or database users, or to encrypt the data that resides within SQL Server. The Always Encrypted feature ensures that data remains encrypted even when a highly privileged account, such as a database administrator, is administering the data, so that the administrator cannot see the sensitive values. Furthermore, you can use Row-Level Security to control access to rows in a database table, based on the characteristics of the user executing a
query.
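
As an illustration of Row-Level Security, the following sketch creates a predicate function and binds it to a table with a security policy. It assumes SQL Server 2016 or later; the schema, table, column, and function names are illustrative.

A minimal Row-Level Security policy

-- Create a schema to hold the security objects
CREATE SCHEMA Security;
GO
-- A row is visible only when its SalesRepUser column matches the current database user
CREATE FUNCTION Security.fn_FilterSalesRep (@SalesRepUser AS sysname)
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN SELECT 1 AS fn_result
WHERE @SalesRepUser = USER_NAME();
GO
-- Bind the predicate to the table as a filter
CREATE SECURITY POLICY Security.SalesRepFilter
ADD FILTER PREDICATE Security.fn_FilterSalesRep(SalesRepUser)
ON dbo.FactSales
WITH (STATE = ON);
GO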

Reporting Services

Access to SQL Server Reporting Services (SSRS) is defined at both the system level and the content level—
access should be controlled by either Windows users or groups, or SQL Server logins. You should also
consider the application protocol that is used to access the report server. For secure data, there is the
option to use HTTPS, or the business might want to use the default HTTP protocol. Finally, settings can be
configured in the RSReportServer.config file and used to mitigate against threats, such as man-in-the-
middle attacks.

Analysis Services

SQL Server Analysis Services (SSAS) uses Windows authentication as the basis of its security model to
authenticate users. After authentication has taken place, Windows users can be made a member of an
Analysis Services role. Permissions can then be assigned to the role to control access to an online
analytical processing (OLAP) database, data sources, a cube, dimensions, dimension members, and specific
cells within the cube. This security model can also be used to secure a data mining structure and data
mining models in traditional Analysis Services.

Integration Services

Integration Services provides a number of security layers to protect the packages that contain the logic to
move data. Package properties and digital signatures can be used to protect all types of SQL Server
Integration Services (SSIS) packages. SQL Server database roles can be used to protect packages that are
deployed to SQL Server; operating system permissions can be used to protect packages that are stored in
the file system.

Master Data Services

Security can be applied within Master Data Services (MDS) to control access to the data that is stored
within the entities in an MDS model. You can also use the Super User permission to assign administrative
permissions that enable users to create subscription views, perform version management, or manage the
security of an MDS model. Like Analysis Services, the security of MDS is based on local or Active Directory
domain users and groups.

Data Quality Services

Data Quality Services (DQS) security is based upon the SQL Server security model, and is managed within
SQL Server Management Studio. SQL logins are added as users in the DQS_MAIN database, and each user
is associated with one of the DQS roles to define their permissions.

In the first instance, the BI operations team should be looking to use some or all of these areas to control
access to the data that is held in the BI infrastructure, through authentication and authorization. For
highly sensitive data, encryption technologies and network protocols should be considered to provide an
additional layer of protection for the data. The key is to keep the approach as simple as possible and try
to make use of Windows security as much as possible, whilst meeting business security requirements.

Tracking Activity with Auditing


The BI operations team should define the level of
auditing that will be performed. As a minimum,
legislative and compliance requirements must be
met by the auditing approach, after which the
team must consider, on a case-by case-basis,
which BI components should fall under the scope
of auditing, and the level of auditing that should
be performed. Auditing can be a resource
intensive operation, so there should also be a
metric for defining when auditing can be stopped.

SQL Server Database Engine


Within the SQL Server Database Engine there are a
number of options for auditing access to the server or to a database.

Common Criteria includes the ability to view login statistics. You can enable this and other settings by
enabling the common criteria compliance server configuration option.

To configure common criteria compliance on a SQL Server instance, execute the following code:

Enabling common criteria compliance


sp_configure 'show advanced options', 1 ;
GO
RECONFIGURE ;
GO
sp_configure 'common criteria compliance enabled', 1 ;
GO
RECONFIGURE ;
GO

SQL Server Audit provides the tools and processes that enable you to create, store, and view audits of
various server and database objects. Several components work together to audit a specific group of server
or database actions.

The SQL Server Audit object is the component that is responsible for collecting a single instance of the
server or database actions that should be monitored. After this is determined, you should define one
server audit specification per audit that is held in the SQL Server Audit object. The server audit
specification collects many server-level audit action groups raised by the Extended Events feature—this is
a general event-handling system for server systems.
Alternatively, you can define a database audit specification per database that is also stored in the SQL
Server Audit object. The database audit specification collects many database-level audit action groups
raised by the Extended Events feature.
Audit action groups are predefined and include the events exposed by the database engine. Audit action
groups can be defined for both the server level and the database level. These actions are sent to the audit,
which records them in the target.
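
For example, you could create a server audit and a server audit specification with Transact-SQL similar to
the following. The audit name, the file path, and the chosen action groups are examples only and should
be adapted to your own requirements:

Creating a server audit and server audit specification (example)

USE master;
GO
-- The file path is an example and must exist on the server
CREATE SERVER AUDIT LoginAudit
TO FILE (FILEPATH = 'D:\Audits\');
GO
CREATE SERVER AUDIT SPECIFICATION LoginAuditSpec
FOR SERVER AUDIT LoginAudit
ADD (FAILED_LOGIN_GROUP),
ADD (SUCCESSFUL_LOGIN_GROUP)
WITH (STATE = ON);
GO
ALTER SERVER AUDIT LoginAudit WITH (STATE = ON);
GO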

Reporting Services

When Reporting Services is installed, report execution logging is enabled by default and retains 60 days of
execution information within the ReportServer database, in a table named dbo.ExecutionLogStorage. This
information is exposed in a number of views that are also created by default and named
dbo.ExecutionLog, dbo.ExecutionLog2 and dbo.ExecutionLog3. The data in these views varies slightly but
contains information that includes user name, ReportID and execution time. This can provide the BI
operations team with information regarding who has accessed which report. Not only can this help from
an audit perspective, but it can also provide evidence about which reports are or are not being used.
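
For example, you could query the execution log with Transact-SQL similar to the following to see who has
recently run which reports. The column list is only a subset of what the view exposes:

Querying the Reporting Services execution log

-- ReportServer is the default report server database name
USE ReportServer;
GO
SELECT TOP (100) UserName, ItemPath, TimeStart, TimeProcessing, Status
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;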

To change the retention duration of the logs, perform the following steps:

Changing the retention period for Reporting Services logs


Open SQL Server Management Studio.
In the Server Type, select Reporting Services, specify the Server name and Authentication
that will be used, and then click Connect.
Right-click the Reporting Services instance name and click Properties.
In the Server Properties page, under Select a page, click Logging.
In the box next to Remove log entries older than this number of days, enter the number of
days you want to keep the logs for, and then click OK.

Analysis Services

Analysis Services has a range of tools that can be used to audit access. You can use SQL Server Profiler to
run a trace against an Analysis Services instance, capturing events such as Audit Login and Audit Logout.
The Query Logging feature that is available within Analysis Services identifies the queries that are issued
against a server and the user name that issued the query.

SQL Server Integration Services

SSIS includes an audit transformation that you can use to create additional output columns within the
dataflow that holds metadata about the package, including the PackageID, PackageName, MachineName
and UserName. In addition, if SSIS is running in project mode, then the SSIS Catalog can also be used to
extract the same information. Because SSIS is an automated process for moving and transforming data, it
is rare for security auditing to be configured for it.

Master Data and Data Quality Services

With no formal in-built auditing, these technologies host databases within a SQL Server Database Engine
instance. You can use SQL Server audit to track access to these databases.

Question: In what scenarios would auditing be appropriate?



Lesson 2
Security Components
Some common components are configured by an operations team that manages the access to data.
Authentication and authorization are the most common components that will be managed. If best
practice is employed and Windows groups are predominantly used, this will mean that, as new users join
the organization, they will be added to the correct groups that would already have access to the SQL
Server. In this scenario, the BI operations team would only have to deal with exceptions, which should be
considered on a case-by-case basis—members of the team may then have to make security changes.

Lesson Objectives
At the end of this lesson, you will be able to implement:

 SQL Server authentication modes.

 SQL Server access control.


 SQL Server encryption.

 Policy-based management.

SQL Server Authentication Modes


Authentication is the process of determining
whether a user has access to a server or service.
There are two main ways that authentication can
occur within SQL Server. It can be achieved by
using Windows authentication, which utilizes the
security information that is stored within Active
Directory. Alternatively, SQL Server authentication
can be used, where user name and password
information is stored within SQL Server.

The SQL Server Database Engine can be configured within the instance properties to support one of two
authentication modes:

 Windows Authentication mode.

 SQL Server and Windows Authentication mode.

Some BI components support both modes, such as the database engine and Reporting Services. Other BI
technologies only support one authentication mode. For example, you can only authenticate to
Integration Services using Windows authentication.
In addition, Reporting Services extends the authentication capability to use either basic authentication or
a custom forms-based authentication. As Reporting Services is a web application, the extensibility of the
authentication enables it to be embedded within custom applications that may use other forms of
authentication.

The BI operations team must decide on a model. SQL Server and Windows Authentication mode should
be used to enable any existing third-party applications to operate. For high security networks, Windows
Authentication mode may be considered to apply security under the control of the Active Directory
network.
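
To check which authentication mode is currently configured on an instance, you can query the relevant
server property, as in the following example:

Checking the authentication mode of an instance

SELECT CASE SERVERPROPERTY('IsIntegratedSecurityOnly')
         WHEN 1 THEN 'Windows Authentication mode'
         WHEN 0 THEN 'SQL Server and Windows Authentication mode'
       END AS AuthenticationMode;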

SQL Server Access Control


Controlling access to the SQL Server occurs at
both the server level, and the database level.

Server Level

You can create objects that are used to control access to the server, and define the level of access and
permissions to the server. The common
objects to create and manage are logins and
server roles—these allow you to create and
manage access to the SQL Server and control the
permissions based on the built-in or user-defined
server roles that are available.
When creating logins, you can choose to use a Windows login, which is integrated into Active Directory,
or a SQL login, where the user account information is stored on the SQL Server itself. It is best practice to
use Windows groups in a server login as much as possible, and use SQL logins when you cannot use a
Windows group or user. An example of this is when you install a third-party application that has no
knowledge of the domain in which it is installed; it therefore creates a SQL login to gain access to the
SQL Server. You can also grant access to external users by using asymmetric keys or certificates that are
mapped to a SQL Server login.

To create a SQL Server login with a Windows account, you should perform the following steps:

Creating a SQL Server Login


In SQL Server Management Studio, in Object Explorer, expand the Security folder of the
server instance in which you want to create the new login.
Right-click the Security folder, point to New, and then click Login.
On the General page, in the Login name box, type the name of a Windows user.
Select Windows Authentication.
Click OK.
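
Alternatively, you can create logins by using Transact-SQL. In the following example, the domain group,
the application login name, and the default database are examples only:

Creating logins with Transact-SQL (example names)

-- Windows login based on a domain group (example group)
CREATE LOGIN [ADVENTUREWORKS\DL_ReadSalesData] FROM WINDOWS
WITH DEFAULT_DATABASE = AdventureWorksDW;
GO
-- SQL Server login for an application that cannot use Windows authentication (example login)
CREATE LOGIN SalesApp
WITH PASSWORD = 'Pa55w.rd', CHECK_POLICY = OFF, DEFAULT_DATABASE = AdventureWorksDW;
GO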

For other BI technologies, the concept of creating a server login is not applicable. For example, no explicit
server login is required for Integration Services, Data Quality Services or Master Data Services. While
Analysis Services does not require a server login to be created, you will need to define a Windows user as
an administrator. Reporting Services also requires a Windows user or SQL login to be mapped to the
system administrator role. This means that the user can then administer the system.

Database Level

The most common objects to create and manage within a database are database users and database
roles. It is typical to map a SQL Server login to a database user, and then use either the built-in or user-
defined roles to control access to the data within a database.

To create a database user by using SQL Server Management Studio, you should perform the following
steps:

Creating a Database User

In SQL Server Management Studio, in Object Explorer, expand the Databases folder.
Expand the database where you want to create the new database user.
Right-click the Security folder, point to New, and then click User.
On the General page, in the User name box, type a name for the new user.
In the Login name box, type the name of the SQL Server login to map to the database
user.
Click OK.
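
Alternatively, you can create the database user and add it to a built-in role by using Transact-SQL. The
database, login, and user names in this example are placeholders:

Creating a database user with Transact-SQL (example names)

USE AdventureWorksDW;
GO
-- Map an example database user to an example Windows group login
CREATE USER SalesReaders FOR LOGIN [ADVENTUREWORKS\DL_ReadSalesData];
GO
-- Add the user to the built-in db_datareader role
ALTER ROLE db_datareader ADD MEMBER SalesReaders;
GO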

You can gain access to other BI services by adding a Windows user or SQL login to a built-in or user-
defined group within the technology—or, as with Analysis Services, permissions can be assigned directly
to a user.

Permissions

Access to objects within any of the SQL Server BI technologies is controlled by permissions that can be
assigned as follows:

 They can be assigned explicitly to a user or a role.

 They are assigned implicitly when a user is made a member of a role.

When first installed, each SQL Server BI technology contains built-in roles, and the approach of role-based
permission management follows best practice for controlling access. For example, Reporting Services
contains the System Administrator and System User roles at the site level. At the content level, there are
built-in roles that include Content Manager and Browser. Each of these roles contains preset permissions
that determine a user’s level of access. It is best practice to make use of the built-in roles as much as is
possible. If a built-in role does not meet the needs of the BI operations team, the team can create a new
role, and then assign the required permissions before adding the users.

SQL Server Encryption


SQL Server hosts cryptographic objects that can be
used to protect data, create logins and database
users, or sign SQL Server objects—such as stored
procedures—so that they cannot be modified. The
core cryptographic objects are keys and
certificates; you must create these first so that they
can be used in the activities previously outlined.

Keys and certificates


Certificates are digitally signed objects that
associate a public key with the identity of the
system or user that holds the corresponding
private key in an asymmetric key pair. Certificates
are issued by a server known as a Certificate Authority, and are used to manage a large number of
requests for asymmetric keys without the need to maintain passwords for each request. Generally, you use
a certificate to encrypt other types of encryption keys in a database, or to sign code modules. Certificates
can also be used to create a SQL Server login or database user.

Keys can either be symmetric or asymmetric. When using a symmetric key, the same key is used to
encrypt and decrypt the data. As a result, the key must be shared by the user or system that encrypts the
data, and by the user or system that decrypts the data. Symmetric keys can be created and protected by a
password or a certificate.

To create a symmetric key, you can perform the following Transact-SQL code:

Creating a symmetric key using the 256-bit Advanced Encryption Standard (AES) that is encrypted
with a password
CREATE SYMMETRIC KEY [SK_MIA-SQL] WITH ALGORITHM = AES_256
ENCRYPTION BY PASSWORD = 'Pa55w.rd';
GO

Asymmetric keys consist of a pair of mathematically related keys. One of the keys is known as the public
key and can be distributed freely; the other is known as the private key and is held only by its owner. Data
encrypted with one of the keys can only be decrypted with the other.

To create an asymmetric key, you can perform the following Transact-SQL code:

Creating an asymmetric key using the 2048-bit RSA encryption algorithm that is encrypted with a
password
CREATE ASYMMETRIC KEY [AK_MIA-SQL]
WITH ALGORITHM = RSA_2048
ENCRYPTION BY PASSWORD = 'S4nFr4nc1sc0';
GO

Extensible Key Management


SQL Server allows the use of third-party encryption keys and management applications in a feature
named Extensible Key Management. This feature enables the encryption key to be stored outside the
database in a Hardware Security Module (HSM). This can offer a more secure solution because the
encryption keys do not reside with the encrypted data—a benefit that can meet the growing demand for
regulatory compliance and concern for data privacy.

Typically, you install the third-party EKM software, which registers a provider object under the
cryptographic providers in SQL Server. You then perform additional steps to encrypt the data at rest; this
is typically used with a feature known as transparent data encryption (TDE).

Transparent Data Encryption

Transparent data encryption is enabled on a database level, and performs real-time encryption and
decryption of the data and log files without the need to rewrite applications. A database encryption key
(DEK) is stored in the database boot record for availability during recovery. The DEK is a symmetric key
secured by using a certificate stored in the master database of the server or an asymmetric key protected
by an EKM module.

To use TDE, you should perform the following steps:

1. Create a master key.


2. Create or obtain a certificate protected by the master key.

3. Create a database encryption key and protect it with the certificate.

4. Set the database to use encryption.



Protecting network traffic

The transfer of data over a public network, from one site to another, would require protection because
data is being transferred across the Internet. For this reason, Windows Server provides the capability, using
the Routing and Remote Access Server, to create site-to-site virtual private network (VPN) tunnels—when
the VPN is created, encryption protocols can be defined to secure the connection. The IPSec protocol is a
VPN encryption protocol that you can use to protect data over the wire. The configuration of the secure
VPN would be performed by the network administration team. This should be considered if the
movement of data takes place from different sites, when it is being loaded into a data warehouse.

Demonstration: Setting Up Transparent Data Encryption


Demonstration Steps
1. Open SQL Server Management Studio, and connect to the MIA-SQL database engine instance.

2. Use Transact-SQL to:

 Create a database master key that is encrypted with the password S4nFr4nc1sc0.
 Create a certificate named ServerCert with the description used for TDE.

 Create a database encryption key that uses the AES_128 algorithm and is encrypted by the
certificate ServerCert.
 Enable transparent data encryption on the AdventureWorksDW database.

Your code should look like this:

USE master;
GO
CREATE MASTER KEY
ENCRYPTION BY PASSWORD = 'S4nFr4nc1sc0';
GO
CREATE CERTIFICATE ServerCert
WITH SUBJECT = 'Used for TDE'
GO
USE AdventureWorksDW
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE ServerCert
GO
ALTER DATABASE AdventureWorksDW
SET ENCRYPTION ON
GO

3. Execute the code and read the warning message.

4. Close SQL Server Management Studio without saving files.



Policy-based Management
Policy-based management enforces the
configuration of different aspects of one or more
instances of SQL Server. After a policy is defined,
SQL Server enforces the settings within that policy.
This can include forcing the naming conventions
of SQL Server objects and forcing the recovery
model of a database. Policy-based management
can be used to enforce SQL Server standards for
the data warehouse that are in line with the BI
operations team server standards.

Policy-based management consists of the following components:

 Policy. A policy is an object that holds the information required to enforce a configuration standard. A
policy consists of a facet, which represents a SQL Server object, and a condition, which is applied to an
object. Finally, a target is defined; this is typically a condition that defines a server name.
 Facet. A facet is an object within a policy that contains properties that relate to a specific SQL Server
object. Facets can include objects such as databases, views, and stored procedures. They can also
represent surface area configuration for SQL Server components, such as the database engine,
Analysis Services and Reporting Services.

 Condition. A condition defines a set of allowed states for a facet against a given target. A policy can
consist of only one condition.
Policies can be imported and exported between different instances of SQL Server in XML format. You can
replicate policy settings on other instances without having to recreate the policy manually.
Question: Does your organization use Active Directory groups or Active Directory users to
access SQL Server BI resources?

Lesson 3
Security Approach for BI Components
Defining a security approach for each BI technology has the benefit of establishing a baseline on which
troubleshooting can be centered, in addition to setting a standard. Common standards should be
followed across all BI technologies, but each technology also has its own unique considerations that need
to be taken into account.

Lesson Objectives
In this lesson, you will describe the security approach for:

 Database Engine

 Integration Services

 Analysis Services

 Reporting Services
 Data Quality Services

 Master Data Services

Database Engine
The operational security approach for server and
database access should be conducted using the
following guidelines:

Determine the authentication model to use

It is best practice to use Windows user accounts where possible; for that reason, both authentication
options include Windows Authentication. If the instance is set to Windows Authentication mode only,
SQL Server logins cannot gain access to the server; applications that rely on a SQL login will therefore be
unable to access the SQL Server instance.

Use Windows groups as SQL Server logins to ease administrative effort

Windows logins can take advantage of Windows groups to organize users into logical groupings. Placing
Windows logins into a Windows group will simplify the administrative effort of managing users for the BI
operations team. Active Directory administrators are responsible for the management of Windows users
and groups. This means that the BI operations team will only be responsible for deciding which user or
group can access the SQL Server instance. Using Windows groups will further simplify the management
from a SQL Server perspective, because there will be fewer objects to manage.

Create a SQL Server login for each user or application that requires access to the server

Should SQL Server logins be required, you should create a separate SQL login. This will help with the
auditing capabilities of SQL Server.

Ensure SQL logins are mapped to database users

When a login is created in SQL Server, you can map the SQL login to a database user. The BI operations
team could potentially encounter support issues for users who believe they should have access to a SQL
Server database, but are unable to do so. This scenario occurs when a SQL login is deleted but the
associated database user remains. These users are then referred to as orphaned users.
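
The following query is one simple way to list database users whose security identifier has no matching
server login, which can indicate orphaned users:

Listing possible orphaned users

-- Run in the database that you want to check
SELECT dp.name AS DatabaseUser
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
    ON dp.sid = sp.sid
WHERE sp.sid IS NULL
  AND dp.type IN ('S', 'U')
  AND dp.name NOT IN ('guest', 'dbo', 'INFORMATION_SCHEMA', 'sys');
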
Make use of the built-in roles as much as possible

SQL Server provides a wide range of built-in server and database roles that contain a default set of
permissions to control access to resources. For example, the database role db_datareader enables users to
read data in the database. The objective is to simplify the management of access control. It is best practice
not to modify the default permissions for built-in roles; if a built-in role does not meet your access control
requirements, you should create a new role and document it as an accepted standard.
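
If a user-defined role is required, it can be created, granted the required permissions, and then have
members added, as in the following example. The role, schema, and user names are examples only:

Creating a user-defined database role (example names)

-- Sales is an example schema and SalesReaders an example database user
CREATE ROLE SalesDataReaders;
GO
GRANT SELECT ON SCHEMA::Sales TO SalesDataReaders;
GO
ALTER ROLE SalesDataReaders ADD MEMBER SalesReaders;
GO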

Manage permission assignment and exceptions through a help desk management tool

Requests for security access should be managed through a help desk management tool. The BI operations
team can use this to record details of a request and their response to it. Exceptions in permissions requests
that are granted should be recorded in an exceptions document, along with some evidence of a signed-
off business approval.

Enable auditing to meet legislative and compliance requirements


Other auditing should be conducted on a case-by-case basis because of the impact on managing the
audit log files and the overhead placed on servers to conduct extensive auditing.

Determine whether encryption is to be used within a database


Encryption can be applied at numerous levels of the database. Transparent Data Encryption and Always
Encrypted can be used to protect the entire database. An alternative is to use column-based encryption to
protect specific information, such as credit card details.

The decisions that are made regarding the guidelines should provide a baseline that can be incorporated
into standards.

SQL Server Integration Services


Integration Services provides different capabilities
that allow you to control access to the contents in
a package; you can control who can execute and
edit packages; and you can secure the files that
are used by the package. The following best
practice should be employed to protect the
Integration Services package:

Deploy packages to SQL Server


You can deploy an Integration Services package to either the file system or SQL Server. When the project
deployment model is used, deploying to SQL Server stores the project and its packages within the SSIS
Catalog. If the package deployment model is used, the msdb database stores the Integration Services
packages.

The benefit of deploying to SQL Server is that it provides an additional layer of security that can be used
to protect the packages. The file system deployments rely only on the NTFS security that is provided by
Windows.

Use package roles to manage SSIS packages

The SSIS Catalog provides an ssis_admin role to manage all SSIS operations, and an ssis_logreader role to
enable members to view the logs that are generated by SSIS.

For the msdb database, there are three fixed database-level roles—db_ssisadmin, db_ssisltduser, and
db_ssisoperator—for controlling access to packages that are saved to the msdb database. By default, the
sysadmin SQL Server role is a member of the db_ssisadmin role. The db_ssisltduser role can view all
packages but only manage their own packages; the db_ssisoperator role can only view the packages
stored in the MSDB database.
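
For example, to allow a Windows group to administer the SSIS Catalog, you could add it to the ssis_admin
role in the SSISDB database. The group name below is an example, and a corresponding server login must
already exist:

Adding a member to the ssis_admin role (example names)

USE SSISDB;
GO
-- ADVENTUREWORKS\BI_Operators is an example group; a matching login must already exist
CREATE USER [ADVENTUREWORKS\BI_Operators] FOR LOGIN [ADVENTUREWORKS\BI_Operators];
GO
ALTER ROLE ssis_admin ADD MEMBER [ADVENTUREWORKS\BI_Operators];
GO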

Determine the package protection level to use

Package protection levels control access to the contents that are held within a package. This can involve
securing the entire package contents or only sensitive data, such as connection information. The
mechanism for providing this protection can be achieved by using a password or user key, or relying on
SQL Server storage.

Digitally signed packages and very sensitive packages


If you need to control access, digital signatures can be assigned to a package. This involves obtaining a
security certificate, and then performing the following steps:

1. In SSIS Designer, on the SSIS menu, click Digital Signing.


2. In the Digital Signing dialog box, click Sign.

3. In the Select a Certificate dialog box, select a certificate.

4. Click OK to close the Select a Certificate dialog box.


5. Click OK to close the Digital Signing dialog box.

6. To save the updated package, click Save Selected Items on the File menu.

The Integration Services designer can then be configured, through the Tools, Options dialog box, to check
the digital signature when loading a package.

SQL Server Analysis Services


Analysis Services operates a similar security model
as the database engine. One key difference
between the two is that Analysis Services uses
Windows Authentication only as the basis of its
security model to authenticate users.
Windows users can then be made members of a role, and permissions can be assigned to that role to
control access to an OLAP database, data sources, a cube, dimensions, dimension
members, and specific cells within the cube.
Analysis Services in Multidimensional mode can
also be used to secure data mining structure and data mining models.

The following guidelines should be followed:

Manage the membership of the server administrator role

In some versions of SQL Server, the BUILTIN\Administrators Windows group is a member of the Analysis
Services server administrator role by default; in other versions, no Windows groups are added to this role.
During the installation of Analysis Services, you can configure the membership of this role so that only the
intended administrators are made members. If this is not performed during the installation, it can be
updated within SQL Server Management Studio.

Define the standards for creating database roles

You can use SQL Server Analysis Services to create roles in both Visual Studio and SQL Server
Management Studio.

Creating roles within SQL Server Management Studio means changes are performed on the deployed
system and will take effect the next time the user connects to the Analysis Services instance. Note that the
change that is made by using SQL Server Management Studio will not be reflected within the Visual
Studio project—it could result in the roles being overwritten or removed should the deployment of the
project define that role members are dropped and created. As a result, you should define standards for
the creation of database roles within Analysis Services.

Avoid the use of the deny permission

If a role provides access to an object, then a member of that role has access to the object, regardless of
whether they are explicitly denied access to the object in another role. As a result, the deny permission is
overridden.

SQL Server Reporting Services


Authentication to the report server is dealt with by
using Windows or SQL Server integrated security.
After authentication has been established,
authorization to resources within the report server,
such as reports, can be managed using role-based
security. When you are defining security for
Reporting Services, the following best practices
should be followed:

Use Windows logins for increased security


Using Windows logins will provide the security
benefits that are a feature of Active Directory.
Management of the users can be handled by the
Active Directory team, leaving the BI operations team to focus on which groups should gain access to the
report server.

Use predefined system and item level roles to fulfill security requirements

In Reporting Services, two default system roles are created—system administrator and system user. The
available default item level roles include Browser, Content Manager, Publisher, My Reports, and Report
Builder. These roles should be used as much as possible to facilitate permissions management. These roles
should also be used to fulfill as much of the access control as possible.

Do not modify the predefined roles

You can create user-defined roles for security requirements that cannot be met by the predefined roles.
Predefined roles should not be modified as this can increase the complexity of resolving access control
issues.

Data Quality Services


The security model for Data Quality Services (DQS)
is based on the SQL Server login security model,
with security management occurring within the
DQS_Main database in SQL Server Management
Studio. In this database, three database roles are
used to define the default access control,
including:

 dqs_administrator – a role that can manage all aspects of DQS.

 dqs_kb_editor – a role that can perform all activities except administering DQS.

 dqs_kb_operator – a role that can manage DQS projects.

It is best practice to make use of these roles to manage access control of the DQS.
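
For example, to give a Windows group the knowledge base editor permissions, you could add it as a user
in the DQS_MAIN database and then add it to the dqs_kb_editor role. The group name is an example, and
a corresponding server login must already exist:

Adding a member to a DQS role (example names)

USE DQS_MAIN;
GO
-- ADVENTUREWORKS\DataStewards is an example group; a matching login must already exist
CREATE USER [ADVENTUREWORKS\DataStewards] FOR LOGIN [ADVENTUREWORKS\DataStewards];
GO
ALTER ROLE dqs_kb_editor ADD MEMBER [ADVENTUREWORKS\DataStewards];
GO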

Master Data Services


The security of Master Data Services (MDS) is
based on local or Active Directory domain users
and groups. There are a number of different
security areas that can be configured, including:

 MDS server security

 Functional area security

 Model security

 Hierarchy permissions
Use the SQL Server administrators as the MDS
administrator

MDS server security defines which user can have full control of the server, the functional areas, models
and hierarchies. By default, this is populated with the user who has installed MDS and is part of the
administrators group on the local server. You should replace this account with the Windows group that
represents the team that manages the operations of the Master Data Server, and then assign the
functional area of Super User.

Use existing groups or create groups for each functional area

Depending on the number of users who manage and use MDS, you can create a Windows group that
represents the functional areas that this group will manage. The available functional areas include:

 Explorer

 Version Management

 Integration Management

 System Administration

 User and Group Permissions


This will control the visibility and usability of the functional area. There is a special functional area named
Super User that gives the assigned group members full permissions to MDS.

Use existing groups or create Windows groups for each model area and hierarchy data
An MDS model is a container for MDS entities that store the master data. For MDS estates that are used
by multiple users, you can use existing Windows groups to control access to the model data.

Question: What is the implication of the deny permission in SSAS?



Lesson 4
The Security Approach in Different BI Environments
Different environments will have differing security conditions to meet the required functional needs.
Production environments will be subjected to tight security policies to ensure protection of the data and
technologies on which the data is hosted. In a nonproduction environment, such as a development
environment, users may have elevated permissions to ensure that they can successfully carry out their role
of developing BI solutions.

Lesson Objectives
At the end of this lesson, you will be able to describe security in:

 Production environments.

 Nonproduction environments.

Production Environments
The production environment should be subjected
to the tight security controls—signed off by the
business—that have been outlined in this module.
Not all practices that have been covered in this
module may be used, but there should be clear
documentation that outlines the security in the
following areas:

 Physical security
 Authentication

 Authorization

 Encryption requirements
 Auditing policies

 SQL Server Database Engine

 SQL Server Integration Services

 SQL Server Analysis Services

 SQL Server Reporting Services

 Master Data Services

 Data Quality Services

 Service maintenance

 Reporting a security breach

The BI operations team should support the BI solution in line with the security strategy that has been
documented. In addition, exceptions to the standards defined within the security documentation should
also be documented and signed off.

Nonproduction Environments
When you consider the security requirements for
nonproduction environments, you need to achieve
a balance between providing protection for the
data and making sure that users can perform their
role. Each business will take advice from their
security team before deciding how they should
approach security.

Developers should be given an appropriate environment in which they can create the BI
solution. For many organizations, this requires the
developers to have full administrative rights to
both the Windows and SQL Server technologies. If
the data that the developers are working against is manufactured—or in other words, is not production
quality data—the concern to protect the data will be reduced.
The operational team might be concerned about the developers’ ability to install newer or third-party
technologies that are not part of the defined standards. However, it should be acknowledged that
developers sometimes require newer technologies as they try to preserve the solutions that they are
building for the future. If new installations are not wholly desirable, the operational team can use group
policy to restrict the type of software that can be installed on a desktop or a server. To ease the
administrative effort, computers can be stored in dedicated organizational units for nonproduction and
production servers, so separate policies can be applied.
By using this approach, the developers will have full access to their own servers without having to install
software that has not been sanctioned. It also means that developers must make a request to install
software; this can then be reviewed and approved by the other relevant stakeholders within the
organization.

Test environments
A test environment may contain a subset of production quality data to provide testers with real-world
scenarios. As a result, security in the test environment will be stricter than that of the development
environment. Access should only be given to those users who have to test, and those who have to validate
the tests. It is not uncommon for testers to have select, insert, update and delete permissions for the data
in these environments. A subset of developers may have select permissions for the data within these
environments.

UAT environments
UAT environments should closely replicate the settings that are used within a production environment. In
these environments, a small group of business users are invited to explore the solutions and data in the
environment, and then provide feedback. It is best practice for all users to be given a dedicated test
account to perform the testing. This ensures that there is differentiation from their production accounts
and security between the two environments is separate. Testers might have select permissions for the UAT
environment; developers typically do not receive access to the environment.

Question: Should developers be given full administrative rights to their development servers?

Lab: Managing Business Intelligence Security


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

You have been asked to demonstrate some of the key security points raised with the team at Adventure
Works. Specifically, you should show the difference between a Windows and a SQL login, and how to
create a database user from a login. You have also been asked to demonstrate how to set up security in
Analysis Services and Reporting Services.

Objectives
After completing this lab, you will be able to:

 Set up security on a SQL Server database.

 Set up security with Analysis Services.


 Set up security within Reporting Services.

Estimated Time: 45 minutes

Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Setting Up Security in SQL Server


Scenario
You have been asked to demonstrate to the Adventure Works BI operations team the range of security
options that are available within a SQL Server instance and a database. This will involve setting up the
authentication model, setting up Windows and SQL logins, and then mapping a login to a database and
assigning the user to a built-in database role.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and Windows Authentication

3. Create SQL Logins for the DL_ReadSalesData Windows Groups

4. Create SQL Logins for the Sales Application

5. Create Database Users in the AdventureworksDW Database from the Windows Group
DL_ReadSalesData and Grant Select Permission to the Sales Schema.

 Task 1: Prepare the Lab Environment


1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.
2. Run Setup.cmd in the D:\Labfiles\Lab03\Starter folder as Administrator.

 Task 2: Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and
Windows Authentication
1. Open SQL Server Management Studio.

2. Connect to the MIA-SQL SQL Server instance.

3. Set the authentication mode on MIA-SQL to SQL Server and Windows Authentication if this mode
is not already active.

 Task 3: Create SQL Logins for the DL_ReadSalesData Windows Groups


 Use Windows Authentication to create a login for the DL_ReadSalesData Windows group and ensure
that the default database is set to AdventureWorks.

 Task 4: Create SQL Logins for the Sales Application


 Use SQL Server authentication to create a SQL login named SalesApp, with a password of Pa55w.rd
and a default database of AdventureWorks. Password policies should not be enforced on this
account.

 Task 5: Create Database Users in the AdventureworksDW Database from the


Windows Group DL_ReadSalesData and Grant Select Permission to the Sales Schema.
1. Use the MIA-SQL\DL_ReadSalesData login to create a database user in the EIM_Demo database
named EIM_SalesReaders.

2. Grant Select permission to the EIM_SalesReaders user over the EDW schema in the EIM_Demo database.

Results: At the end of this exercise, you will have:

Set up the authentication model for a SQL Server instance.

Created a SQL login using a Windows group.

Created a SQL Server login.

Created a database user.

Mapped a database user to a built-in database role.



Exercise 2: Setting Up Security in SQL Server Analysis Services


Scenario
For this exercise, you have been asked to demonstrate how to set up security in Analysis Services. You will
create a role named DBProcess that gives the user, Gerry O’Brien, the ability to process an Analysis
Services database.

The main tasks for this exercise are as follows:

1. Opening Up a Team Foundation Server Project

2. Creating an Analysis Services Database Role

3. Testing Analysis Services Permissions

 Task 1: Opening Up a Team Foundation Server Project


1. Open Visual Studio 2017.
2. Connect to Team Foundation Server, open the AdventureWorksBISolutions collection.

3. Open the AW_BI solution.

 Task 2: Creating an Analysis Services Database Role


1. Create a role named DBProcess that can process the AW_SSAS database.
2. Save the AW_BI solution and close the Role designer.

3. Deploy the AW_SSAS solution.

 Task 3: Testing Analysis Services Permissions


1. Log out from Windows as Student, and log in as ADVENTUREWORKS\GOBrien, with password
Pa55w.rd

2. Start SQL Server Management Studio

3. Connect to the Analysis Services MIA-SQL instance.


4. Process the AW_SSAS Analysis Services database.

5. Try to create a database role.

Results: At the end of this exercise, you will have:

Created a database role and added a database user within the role.

Exercise 3: Setting Up Security in SQL Server Reporting Services


Scenario
You have been asked to give Gregory Webber the ability to manage the security of Reporting Services. He
must not be able to view any of the content on Reporting Services, nor be able to perform any other kind
of administration on the report server. You will create a system level role named SecurityAdmin that has
the permission to manage report security.

The main tasks for this exercise are as follows:

1. Create a System Level Role

2. Verify the System Level Role Permission

 Task 1: Create a System Level Role


1. Log out from Windows as GOBrien, and log in as ADVENTUREWORKS\Student, with password
Pa55w.rd

2. In SQL Server Management Studio, connect to the Reporting Services MIA-SQL\SSRS instance.

3. Create a new system level role named SecurityAdmin and assign the manage report server security
permissions.

4. Assign the new role SecurityAdmin to the GWebber user account.

 Task 2: Verify the System Level Role Permission


1. Log out from Windows as Student, and log in as ADVENTUREWORKS\GWebber, with password
Pa55w.rd

2. Examine the report server site settings while logged in as GWebber.

3. Sign out from Windows when you have finished

Results: At the end of this exercise, you will have:

Assigned the report security permission to Gregory Webber.

Tested the security permissions.

Question: Which best practices for security would you envisage being able to implement
when you return to your own workplace?

Question: At any point, did the lab not follow security best practice?

Module Review and Takeaways


In this module, you have taken a holistic look at the security that can impact the BI technologies that are
implemented. Common themes should have appeared across some of the technologies, such as using
groups to manage the security. You will also be aware of the security nuances that are specific to different
technologies.

You should now be able to:

 Describe the security approach to a BI solution.

 Understand the security components available.

 Apply the security components to BI technologies.

 Manage security in different environments.


MCT USE ONLY. STUDENT USE PROHIBITED
4-1

Module 4
Deploying BI Solutions
Contents:
Module Overview 4-1 
Lesson 1: Application Life Cycle Management for BI Solutions 4-2 

Lesson 2: Stand-alone Deployments 4-5 

Lesson 3: Team-Based Deployments 4-14 


Lab: Deploying BI Solutions 4-19 

Module Review and Takeaways 4-23 

Module Overview
Deploying BI solutions is a discrete part of the BI development life cycle. The BI operations team will be
called upon to support the development team during the deployment. The aim is to successfully create
the solution within a production environment for operational use. The presence of nonproduction
environments provides the opportunity to practice the deployments before they are conducted on a
production server, so that the deployments can run smoothly.

A variety of tools and practices can be used to aid deployments. Each method has its own benefits and
can be used in any environment. Understanding the tools that are available and the benefits they offer
will help you to pick the right tool for the job and aid deployment.

Objectives
At the end of this module, you will understand:

 Application life cycle management for BI solutions.

 Stand-alone deployments.

 Team-based deployments.

Lesson 1
Application Life Cycle Management for BI Solutions
The deployment of a BI solution does not stop with the completion of a major BI project. After users start
to gain value from the solution, the BI development team will be asked to add new functionality to further
increase that value. The job of the BI operations team is to support the development team in performing
the deployments, so that there is minimal disruption to the live solution.
Therefore, an understanding of the BI development life cycle, and how source control and Team
Foundation Server (TFS) can help with this, is important to the success of managing the BI life cycle. The BI
operations team can help to ensure that the business receives the updates in an efficient manner.

Lesson Objectives
At the end of this lesson, you will be able to describe:

 The BI development life cycle.


 The need for source control.

 How to harmonize operational management with the BI life cycle.

The BI Development Life Cycle


The BI development life cycle consists of a number
of phases that can help the delivery of BI solutions
to a production environment. At a high level, this
can be broken down into the following four
phases:

 Design. The design phase contains the


functional and nonfunctional requirements for
the solution that is to be delivered. The
requirements are provided by the BI Architect
to be delivered by the development team.
This may initially start out as a proof of
concept. During this phase, an iterative
approach may take place to refine the design. This phase is typically managed by the BI Architects
and the developers—the BI operations team may not be involved at this phase. For large scale
projects, it is advisable for a BI operational team to provide inputs into the design to add a support
perspective to the solution.
 Develop. The developers use the available tools and technologies to deliver the design that has been
provided by the BI Architect. At this phase, the BI operations team will support the developers by
providing the required technologies. The team may even reject technologies that are put forward.
The intention is to provide the developer team with the tools that they need to complete the work. At
this point, developers will be responsible for their own deployments to a development environment.

 Testing. Testing provides the first opportunity to test the deployment mechanisms that will be used
in all of the environments. It is important for the BI operations team to be involved at this stage.
There may be resistance to this, but the process of deploying the solution to an environment should
be tested, so that the team can identify and deal with any problems that might occur. In this phase,
the objective is for the BI operational team to define standards for deployments.

 Deployment. By the time that a deployment occurs in a production environment, standards should
be set from the deployments that have taken place in nonproduction environments. There are
different methods to deploy a solution but the aim is for the deployment to be automated as much
as possible, so that it can occur seamlessly. This capability should also be practiced in a
nonproduction environment. It is also best practice to create a documented build guide and rollback
document to support the production deployment.

The Need for Source Control


For development projects that involve
modifications by multiple developers, source
control becomes an important part of managing
the BI solution files. TFS provides the capability to
manage this working practice. With TFS, you can:
 Centralize the storage of the BI solutions
files. TFS provides a central repository for
storage of the BI solution files—this means
that developers can download a copy of the
files to a local workspace located on their own
development machines. The developers can
then work on the same files independently,
and are able to check in the local copy to the TFS. Should a conflict occur in the files, you can use a
resolution tool to choose which code should be checked into the TFS.
 Use version control to store the solution files. You can use TFS to store multiple versions of the
same files. As the file is checked back in to TFS, the file is stamped with a version. You can also
provide a description against the version that is being created. The ability to create versions also
means that you can revert to a previous version or recover a deleted file.
Versions can be brought together as part of a build-and-release approach to provide a coherent way of
deploying a consolidated solution to an environment.

Harmonizing Operational Management with the BI Development Life


Cycle
The BI operations role will likely involve managing
the build and deployment of the code that is
stored within TFS. Working with the project
managers on a deployment timetable—and with
the developers to determine which files to
deploy—the BI operations team will be
responsible for defining a build, managing the
release, and checking that the release has been
successful.
This process requires a coordinated approach by
all of the team. It is typically led by a project
manager, who will define a schedule to
accommodate the development, build, and release of the code to a given environment. A variety of
methods can be used to manage the approach within a schedule. Organizations may opt for a waterfall
approach to delivery, where the development of BI functionality occurs sequentially, and the output is for
a discrete portion of functionality, which is clearly defined against a given schedule. Alternatively, an agile
approach may be used by organizations to deliver a plethora of functionality across different parts of a
team at the same time.
Regardless of the approach, it is prudent to determine a period within the schedule where there is a code
freeze. The BI operations team can then collate the checked-in versions of the code into a build; produce
a build guide that describes the actions to take during the deployment; and then continue with a release
by deploying the code through a variety of methods. This does not stop developers from working,
providing they do not check in new BI functionality during the code freeze. TFS provides the functionality
for developers to “shelve” changes. In this scenario, development efforts are saved to the TFS server, but
are not committed as a version to the TFS database. This means that the BI operations team can continue
to manage a release and build without disruption to the developers.

The BI operations team can use a range of technologies to manage deployments of the solution. It is
important that this team communicates the process and the steps for the deployment, in addition to any
rollback strategy that may be required, should a deployment fail.

Question: How do you manage the deployments of BI solutions?



Lesson 2
Stand-alone Deployments
Stand-alone deployments are useful to developer teams that are deploying to nonproduction
environments, or for single individual teams deploying to a production environment. SQL Server and
Visual Studio® provide a wide range of tools that can support stand-alone deployments of both
databases and BI components. You can also use automation to automate the process of deployment
through a range of languages.

Lesson Objectives
At the end of this lesson, you will be able to perform:

 DACPAC deployments

 BACPAC deployments

 Visual Studio deployments


 Scripted deployments

Data Tier Deployments


In the development life cycle, a normal activity is
to create a database, such as a data warehouse or
supporting databases, and distribute them to
other environments. This is common in
development environments when a developer
wishes to share development databases with
colleagues, or wants to provide a database to
another environment, such as a testing
environment. This can be achieved by using a Data
Tier Application Package (DACPAC).

A DACPAC is a self-contained unit of database deployment that houses the schema of database
objects such as tables, views, stored procedures and other SQL Server objects that can be deployed to
another SQL Server instance. DACPACs that are generated in the latest release of SQL Server can also be
released to earlier versions; the DACPAC wizard will advise of any incompatibility issues with earlier
versions before the deployment occurs. Specifically, a DACPAC can be deployed to an instance of the
database engine running SQL Server 2005 Service Pack 4 (SP4) or later. However, you should be mindful
that, if a DACPAC is created on a newer version of SQL Server, it may create objects that are not
supported by earlier versions.

A BI developer can create a database on their local instance. After the database creation is complete, the
following steps can be performed to create a data tier application for deployment to another instance:
1. Open SQL Server Management Studio and connect to the instance that contains the database.

2. In Object Explorer, expand the Database node, right-click a database, point to Tasks, and click
Extract Data-tier Application.

3. In the Extract Data-tier Application window, on the Introduction page, click Next.

4. On the Set Properties page, type a name for the application, and then type a version number under
Version; optionally, you can add a description, and then under Save to DAC package file, browse to
a location to store the file. Optionally, you can select the check box to overwrite an existing package
with the same name, and then click Next.

5. On the Validation and Summary page, click Next.


6. On the Build Package page, when the build is complete, click Finish.

To deploy a data tier package, you should perform the following steps:

1. Open SQL Server Management Studio and connect to the instance that contains the database.

2. In Object Explorer, right-click the Database node, and click Deploy Data-tier Application.

3. In the Deploy Data-tier Application window, on the Introduction page, click Next.

4. On the Select a Package page, browse and select the dacpac package previously created, and click
Next.

5. On the Update Configuration page, under Name, optionally change the name of the database, and
then click Next.
6. On the Summary page, click Next.

7. On the Deploy DAC page, wait for the deployment to complete, and then click Finish.
On completion, the data tier application deployment will create the database and the objects that are
contained within the database, but it will not contain data.
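
The same extract and deploy cycle can also be scripted. The following PowerShell sketch uses SqlPackage.exe; the installation path shown is typical for a SQL Server 2017 installation and, like the server, database, and file names, is an assumption that you should adjust to your own environment.

# Path to SqlPackage.exe (assumed default location; adjust for your installation).
$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"

# Extract the schema of an existing database into a DACPAC file.
& $sqlPackage /Action:Extract `
    /SourceServerName:"MIA-SQL" /SourceDatabaseName:"EIM_Demo" `
    /TargetFile:"D:\Deploy\EIM_Demo.dacpac"

# Publish (deploy) the DACPAC to another instance, creating or upgrading the target database.
& $sqlPackage /Action:Publish `
    /SourceFile:"D:\Deploy\EIM_Demo.dacpac" `
    /TargetServerName:"MIA-SQL\SQL2" /TargetDatabaseName:"EIM_Demo_Test"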

BACPAC Deployments
If a developer also needs to deploy tables
and their associated data, an alternative approach
to using a DACPAC is to create a BACPAC. The
fundamental difference between the two is that a
BACPAC contains both the schema and the data
when created and deployed.
To create a BACPAC, you should perform the
following steps:
1. Open SQL Server Management Studio and
connect to the instance that contains the
database.
2. In Object Explorer, expand the Database node, right-click a database, point to Tasks, and click
Export Data-tier Application.

3. In the Export Data-tier Application window, on the Introduction page, click Next.

4. On the Export Settings page, in the settings tab, select the radio button next to either Save to local
disk, or Save to Windows Azure™, and browse to a storage location.

5. Optionally, on the Advanced tab, select the table to export.

6. On the Summary page, click Finish.

7. On the Operation Complete page, when the build is complete, click Close.

To reuse the BACPAC on another SQL Server instance, you should perform the following steps:

1. Connect to the instance of SQL Server, whether that is on-premises or in the Azure SQL Database.

2. In Object Explorer, right-click Databases, and then select the Import Data-tier Application menu
item to launch the wizard.

3. On the Introduction page, click Next.

4. On the Import Settings page, select the radio button next to either Import from Local Disk or
Import from Windows Azure, and browse to a storage location, and then select the BACPAC. Click
Next.

5. On the Database Settings page, change the name of the database, and then click Next.

6. On the Summary page, click Finish.

7. When the BACPAC is imported, click Close.
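
The export and import steps can also be scripted with SqlPackage.exe, which handles BACPACs as well as DACPACs. This is a minimal sketch; the SqlPackage.exe path, server names, database names, and file paths are illustrative assumptions.

$sqlPackage = "C:\Program Files (x86)\Microsoft SQL Server\140\DAC\bin\SqlPackage.exe"

# Export the schema and data of a database to a BACPAC file.
& $sqlPackage /Action:Export `
    /SourceServerName:"MIA-SQL" /SourceDatabaseName:"EIM_Demo" `
    /TargetFile:"D:\Deploy\EIM_Demo.bacpac"

# Import the BACPAC as a new database on another instance (or an Azure SQL Database server).
& $sqlPackage /Action:Import `
    /SourceFile:"D:\Deploy\EIM_Demo.bacpac" `
    /TargetServerName:"MIA-SQL\SQL2" /TargetDatabaseName:"EIM_Demo_Copy"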


A BACPAC only supports the following types of database objects:

 Inline table-valued function

 Multistatement table-valued function


 Scalar function

 Clustered index

 Nonclustered index
 Spatial index

 Unique index

 Logins
 Users

 Database roles

 Role memberships
 Permissions

 Schemas

 Statistics

 Transact-SQL stored procedures

 Synonyms

 Check constraint

 Collation settings

 Columns

 Computed columns
 Default constraint

 Foreign key constraints

 Primary key constraints

 Unique constraints

 DML triggers

 Tables

 Views

 HIERARCHYID, GEOMETRY, GEOGRAPHY data types

 User-defined data type

 User-defined table type

If a database contains objects that are not supported, then an error is returned in the wizard.

Visual Studio Deployments


You can use Visual Studio to deploy a range of BI
projects from a stand-alone environment,
including:

 SQL Server Database projects.

 SQL Server Integration Services projects.

 SQL Server Analysis Services projects.

 SQL Server Reporting Services projects.

For this to be completed and deployed successfully, a number of properties must be defined within the specific project.
SQL Server Database Projects

The deployment settings for a database project can be defined by right-clicking the database project, and
then clicking Properties.
In the Project Settings page, you can define the following settings:

 Target platform. This defines the version of the SQL Server platform that the project files are
intended for.

 Output types. This provides the ability to generate a DACPAC file and/or a SQL script that can be
used to deploy to an instance of SQL Server in the Project Settings page.

 General. This defines the default schema in the database and whether to include the schema name in
the filenames that are generated.

You can use the additional pages that are available to define how to manage SQLCLR, debugging levels
and builds with pre- and post-deployment scripts that are important to team deployments.

SQL Server Integration Services

SQL Server provides the capability to perform incremental package deployments. This means that
selected packages can be deployed from a project, rather than having to deploy every package in the
project. The process to create a deployment .ispac file still exists and requires the SSIS project
properties to be set to define the location for the .ispac file by performing the following steps:
1. In Visual Studio, right-click the project, and click Properties.

2. Note that in the General page under Configuration Properties, in the Configuration drop-down list,
it states Active(Development).

3. In the Build page, specify the output path, which by default is set to bin.

4. In the Deployment page, set the Server Name and the Server Project Path.

5. Optionally, set additional options in the Debugging page, and click OK.

6. Right-click the SSIS project, and click Build.

After the build completes, you can browse to the output path in the project file location. If the default value is
used, a folder named “bin” will appear. Within the bin folder, another folder named “development” will
appear, reflecting the configuration that was selected. The .ispac file will now be ready to be deployed.

To deploy the SSIS solution, double-click the .ispac file and perform the following steps:

1. In the Introduction page, click Next.


2. In the Select Source page, in the drop-down list, click the deployment model as Package
Deployment.

3. In the Packages Folder Name, browse to the location of the SSIS project files to display a list of
packages, and then select the packages to deploy. Optionally, specify any passwords and click Apply,
click Refresh, and then click Next.

4. In the Select Destination page, type the server name and path, then click Next.

5. In the Review screen, click Next to start the deployment.

6. In the Completion page, click Finish.
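
For projects that use the project deployment model, the deployment of an .ispac file can also be scripted rather than run through the wizard interactively. The following PowerShell sketch uses the SSIS management object model; the server name, catalog folder, project name, and file path are illustrative, and the script assumes that the SSIS Catalog (SSISDB) already exists on the target instance.

# Load the SSIS management object model and connect to the target instance.
[System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SqlServer.Management.IntegrationServices") | Out-Null
$conn    = New-Object System.Data.SqlClient.SqlConnection "Data Source=MIA-SQL;Initial Catalog=master;Integrated Security=SSPI;"
$ssis    = New-Object Microsoft.SqlServer.Management.IntegrationServices.IntegrationServices $conn
$catalog = $ssis.Catalogs["SSISDB"]

# Create the target folder if it does not already exist.
$folder = $catalog.Folders["ETL"]
if ($folder -eq $null) {
    $folder = New-Object Microsoft.SqlServer.Management.IntegrationServices.CatalogFolder($catalog, "ETL", "ETL projects")
    $folder.Create()
}

# Deploy the .ispac file produced by the Visual Studio build.
$projectBytes = [System.IO.File]::ReadAllBytes("D:\Deploy\AW_SSIS.ispac")
$folder.DeployProject("AW_SSIS", $projectBytes)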

SQL Server Analysis Services

Visual Studio can be used to deploy an Analysis Services project—it builds a complete set of XML files in
an output folder that has the commands required to build all of the Analysis Services database objects in
the project. A number of properties can be set in the Analysis Services project properties, under the
Deployment page, including:

 Process Options. This option determines whether the cube is processed as it is deployed.

 Transactional Deployment. This determines whether the deployment is a transaction and will roll
back the deployment should it fail.

 Deployment Mode. This specifies whether all, or only the changes, of the Analysis Services objects
are deployed.

 Server. This is the name of the server to which the Analysis Services solution is deployed.

 Database. This is the name of the database in which the Analysis Services solution is deployed.
Alternative methods for deploying an Analysis Services solution include the Analysis Services Deployment
Wizard, Backup and Restore of Analysis Services databases, and XMLA scripting.
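
The Analysis Services Deployment Wizard can also be run from the command line, which is useful when a deployment needs to be repeatable. The following PowerShell sketch assumes the default installation path of the wizard executable; the .asdatabase, log, and script file paths are illustrative.

$asDeploy = "C:\Program Files (x86)\Microsoft SQL Server\140\Tools\Binn\ManagementStudio\Microsoft.AnalysisServices.Deployment.exe"

# Deploy silently, writing progress to a log file (/s = silent mode).
& $asDeploy "D:\Deploy\AW_SSAS\AW_SSAS.asdatabase" /s:"D:\Deploy\AW_SSAS\deploy.log"

# Alternatively, generate an XMLA deployment script instead of deploying directly
# (/o = output script file, /d = do not connect to the target server).
& $asDeploy "D:\Deploy\AW_SSAS\AW_SSAS.asdatabase" /o:"D:\Deploy\AW_SSAS\AW_SSAS.xmla" /d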

SQL Server Reporting Services

Some properties must be set to ensure the successful deployment of reports. Some of these options can
be defined within the report wizard. For reports that are not created within the report wizard, you can use
the reporting services project properties to set the same and additional options by right-clicking an SSRS
project and clicking Properties. Within the deployment options there are a number of settings, including:
 Overwrite data sources | Overwrite Datasets. By default, this is set to false. If set to true, any data
sources or datasets that are edited within the report project will overwrite any existing data sources or
datasets on the report server.

 TargetDataSourceFolder | TargetDataSetFolder | TargetReportFolder | TargetReportPartFolder.


This defines the name of the folder that will hold the data sources, datasets, reports and report parts.

 TargetServerURL. This is the web address for the report server on which the reports will be located.
Note that, when deploying to SharePoint® services, you must specify a URL with the report folder
and the data sources folder.

 TargetServerVersion. This specifies the version of Reporting Services that the reports are intended
for.

Once the settings are complete, you can right-click the SSRS project and click Deploy.

Deploying Data Quality Services and Master Data Services


Data Quality Services
Data Quality Services uses a knowledge base to
store the cleansing and deduplication rules for
handling known data errors. During the BI
development life cycle, the creation of the rules
that are used to inform Data Quality Services may
be created on a server in a nonproduction
environment—to ensure that the defined rules
provide the required output. The BI operations
team can then work with the data curators to
promote the knowledge base from a
nonproduction environment to a production
environment, using the export and import functionality.

To export a knowledge base, you should perform the following steps:


1. Start Data Quality Client.

2. Connect to the SQL Server instance of Data Quality Services.

3. In the Data Quality Client home screen, open a knowledge base in the Domain Management activity.

4. In the Domain Management page (with any tab selected), click the Export Knowledge Base data
icon above the Domain list, and then click Export Knowledge Base.

5. In the Export to Data File dialog box, go to the folder in which you want to save the file. Name the
file or keep the knowledge base name, keep DQS Data Files (*.dqs) as the Save as type, and then click
Save.

6. In the Export Knowledge Base dialog box, click OK.



To import a knowledge base, you should perform the following steps:

1. Start Data Quality Client.

2. Connect to a SQL Server instance of Data Quality Services.

3. In the Data Quality Client home screen, click New Knowledge Base.

4. Enter a name for the knowledge base.

5. Click the down arrow for Create Knowledge Base from, and then select Import from DQS file.

6. Click Browse to select the data file.

7. In the Import from Data File dialog box, go to the folder that contains the .dqs file that you want to
import, and then click the name of the file. Click Open and then click Next.

8. Select the activity that you want to perform, and then click Create.

9. Click Publish to publish the knowledge in the knowledge base, and then click OK.

10. In the Data Quality Services home page, verify that the knowledge base is listed under Recent
knowledge bases.

Master Data Services

Master Data Services provides a number of tools with which you can manage the movement of a model
between different instances of SQL Server, depending on your requirements. If you need to move both
the model structure and its data, you can use the MDSModelDeploy tool to create a package. You can
then use this package to create a new model, create a clone of a model, or update an existing model and
its data. This will affect the command used when deploying the model to the server. If the requirement is
to only move the structures, you can use the Model Deployment Wizard.

To create a package using the MDSModelDeploy tool, you should perform the following steps:

1. Open the command prompt, using Run as Administrator.


2. Go to the MDSModelDeploy.exe file located by default in C:\Program Files\Microsoft SQL
Server\140\Master Data Services\Configuration.

3. To determine the name of the service you will deploy to, type MDSModelDeploy listservices in the
command prompt, and then press Enter.
4. To create a package named Agents from the model named Insurance, using the PreProd version from
the service named MDS and including its data, type the following command, and then press Enter:
MDSModelDeploy createpackage -model Insurance -version PreProd -service MDS -package Agents -includedata

To deploy a package to another server using the MDSModelDeploy tool, you should perform the following
steps:
1. Open the command prompt, using Run as Administrator.

2. Go to the MDSModelDeploy.exe file located by default in C:\Program Files\Microsoft SQL
Server\140\Master Data Services\Configuration.

3. To determine the name of the service you will deploy to, type MDSModelDeploy listservices in the
command prompt, and then press Enter.

4. To deploy a new model, type the following command in the command prompt:
MDSModelDeploy deploynew -package Agents -model Insurance -service DefaultWebsite

5. To clone the model from the package, type the following command in the command prompt:
MDSModelDeploy deployclone -package Agents

6. To update an existing model, type the following command in the command prompt:
MDSModelDeploy deployupdate -package Agents -version PreProd

Scripted Deployments
You may want to automate deployments to a
production server as much as possible—to
remove the human interaction that could lead to
mistakes being made. A number of options are
available to facilitate this, including:

Using scripts to build databases and database objects

A variety of scripting languages can be used to build the supporting databases and objects for a
BI solution, should you not make use of database projects within Visual Studio.

For example, you could use SQLCMD to refer to sql scripts that can be executed in a specific order. If the
development team is disciplined enough, they may keep a script of all the database objects that have
been created. In this case, a SQLCMD command can be used to call the sql scripts one by one in an
orderly manner. Alternatively, if a naming scheme has been defined for the sql scripts, they could be
executed in bulk.
For example, all tables created in a Staging database may have sql files that begin with STG stored in a
folder named C:\DBObjects\Staging. This means that a batch file could be created and stored in the
C:\DBObjects\Staging folder that executes all of the files beginning with STG against the local server in a
database named Staging, using the following command:

For %%G in (STG*.sql) do sqlcmd -S localhost -d Staging -E -i "%%G"


The %%G value in the batch file acts as a variable, passing in any sql script name that begins with STG to
be executed. This command can then be saved in a file with a .BAT extension to be executed when the
database objects require recreating. PowerShell™ can also be used to automate the process of creating
databases and the associated objects as an alternative.
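
As a minimal sketch of that PowerShell alternative, the following loop runs every STG*.sql file in the folder against the Staging database. It assumes that the SqlServer PowerShell module, which provides the Invoke-Sqlcmd cmdlet, is installed; the folder, server, and database names match the batch file example above.

Import-Module SqlServer

# Run each staging script, in name order, against the Staging database on the local server.
Get-ChildItem -Path "C:\DBObjects\Staging" -Filter "STG*.sql" |
    Sort-Object Name |
    ForEach-Object {
        Invoke-Sqlcmd -ServerInstance "localhost" -Database "Staging" -InputFile $_.FullName
    }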

Scripting Analysis Services

You can generate XML for Analysis (XMLA) scripts by using the
deployment wizard that allows you to execute the script in SQL Server Management Studio, or use the
ASCMD utility to execute the XMLA in a scripted manner. The XMLA script created from the deployment
wizard will recreate the database objects that are defined within the script file.

The XMLA script consists of settings that you can use to create the Analysis Services objects. It could also
contain the settings required to process the Analysis Services database and the objects found in the script.
You can use any editor to edit the XMLA script to add custom objects through the XMLA language, when
you have stored the XMLA script in a saved file.

Using the Analysis Services Deployment wizard to create an XMLA script

To create an XMLA script using the deployment wizard, perform the following steps:

1. Click Start, type Deployment Wizard, and then click the Deployment Wizard icon.

2. On the Welcome to the Analysis Services Deployment Wizard page, click Next.
3. On the Specify Source Analysis Services Database page, browse to the folder location in the
Database file box and select the .asdatabase file, and then click Next.

4. On the Installation Target page, change the value in the Database box to define the name of the
database, and then click Next.

5. On the Specify Options for Partitions and Roles page, you can optionally create partitions and
roles, and then click Next.

6. On the Specify Configuration Properties page, specify any required configuration options and click
Next.

7. On the Select Processing Options page, select the desired processing options and click Next.
8. On the Confirm Deployment page, select the Create deployment script check box, browse to a
location to store the XMLA file, and then click Next.

9. On the Deploying Database page, wait for the deployment script to be completed, and then click
Next.

10. On the Deployment Complete page, click Finish.

11. The XMLA script file will appear in the folder defined in the wizard.
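
After the script has been generated, it can be executed from SQL Server Management Studio, from the ASCMD utility, or from PowerShell. The following sketch assumes that the SqlServer module, which provides the Invoke-ASCmd cmdlet, is installed; the server name and script path are illustrative.

Import-Module SqlServer

# Execute the generated XMLA deployment script against the target Analysis Services instance.
Invoke-ASCmd -Server "MIA-SQL" -InputFile "D:\Deploy\AW_SSAS\AW_SSASScript.xmla"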

Scripting SSIS Deployment

The dtutil command can be used to manage SSIS packages. This can include moving and copying
packages to SQL Server or to a folder location in Windows. You can also use the tool to delete SSIS
packages. When moving SSIS packages to SQL Server, you might be required to define credentials in the
command; without them, dtutil will run under the credentials of the user who executes the command.

The following example uses dtutil to copy a package named DWLoad that is stored in the msdb database
on a local instance of SQL Server using Windows Authentication to the SSIS Package Store in a folder
named ETL.

Using dtutil to copy a SSIS package in SQL Server


dtutil /SQL DWLoad /COPY DTS;ETL\DWLoad

This example copies a file system SSIS package named DWLoad, located in the root of the C: drive, to an instance of
SQL Server named Seattle.

Copying a SSIS package from the file system to SQL Server.


dtutil /FILE c:\DWLoad.dtsx /DestServer Seattle /COPY SQL;DWLoad
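
Building on the two dtutil examples above, the copy operation can be wrapped in a loop to deploy a whole folder of packages. The following PowerShell sketch is illustrative; the folder path and target server name are assumptions.

# Copy every .dtsx file in the folder to SQL Server (msdb) storage on the target server,
# using the file name (without its extension) as the package name.
Get-ChildItem -Path "D:\Deploy\Packages" -Filter "*.dtsx" | ForEach-Object {
    & dtutil /FILE $_.FullName /DestServer "MIA-SQL" /COPY "SQL;$($_.BaseName)"
}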

Question: How do you handle the creation of databases in nonproduction environments?



Lesson 3
Team-Based Deployments
The ability to deploy a BI solution to different environments is important to the BI development life cycle.
This operation can be done by using the build and release management capability of TFS. After you have
become familiar with this, you can streamline deployments to different environments using the same code
base. This will provide consistency in the solutions, and is particularly important for UAT and production
environments.
Deployments can sometimes go wrong so it is important that the management of builds and releases is
supported with an appropriate rollback strategy. Having a rollback strategy in place will ensure that you
can bring an environment back to a known good state if required.

Lesson Objectives
In this lesson, you will learn about:
 Team Foundation Server (TFS).

 Release management in TFS.

 Build management in TFS.


 Rollback strategies.

Team Foundation Server


TFS plays an important role in managing the
releases and builds of BI solutions from a
development team to different environments.
Many of the project properties that have been
covered in earlier topics in this module are still
relevant and require configuring for each project.
The focus at this stage is on using TFS to cover the
build and release aspects of the solution that you
have in place.
Team Explorer, the SQL Server Data Tools add-in that enables interaction with TFS, becomes the central
tool for managing the technical aspects of the release and build cycle. However, the procedure for
managing this process is often complex, and needs to be handled by a team member, such as a project
manager, who can coordinate the efforts of the entire team.
When you are using TFS to manage team-based deployments, there are three main considerations:

 Release Management. This involves collating the correct code that is part of the release and
ensuring that the correct settings are defined for the environment in which the release will take place.

 Build Management. This provides the ability to create a build definition that will be used to deploy
the code that is defined within a release.

 Rollback Management. This involves the steps and technologies used to manage the rollback of a
deployment to a known state for the purpose of ensuring continued operations of a given
environment. Hopefully, a rollback strategy will not be required. However, if a build does fail to
deploy, the BI operations team should be able to restore the environment to an operational state as
quickly as possible.
The operations team should also collate a build guide that contains release notes—these should provide
an overview of the functionality that is part of the release. The build guide should contain the detailed
steps on how the deployment will be performed. There should also be documentation that refers to the
rollback approach that would be undertaken should the deployment fail.

Release Management in Team Foundation Server


Release management is both a process and a
service within TFS that means a project
development team can coordinate the builds of
your BI solution code to different environments.

However, the release process still needs managing.


For example, TFS enables subsets of a
development team to “branch out” their own
projects into a TFS branch. This means that a
development team can work on a separate branch
of code without affecting the code that is
contained within a main branch. The use case for
such working practices includes a separate team
working on a set of functionality that also depends on functionality built by other teams. The code needs
to be developed without affecting the other teams.

Creating a Branch in Team Explorer


Team Explorer within Visual Studio provides the capability to create a separate branch by performing the
following steps:
1. In Visual Studio, open Source Control Explorer by clicking View on the menu bar, point to Other
Windows, and then click Source Control Explorer.

2. In Team Explorer, right-click the solution or folder you want to branch, point to Branching and
Merging, and then click Branch.

3. In the Branch dialog box, type a name, and then click OK.

4. A new item, which has the name defined in the Branch dialog box, will appear in Team Explorer.

5. Open the solution file within the new branched code and work as normal.
When the time comes to define a release, a branch of the code can be merged back into the master
branch of code ready to be deployed. For items where there is a conflict, a window appears that enables
you to accept or reject changes in code from either the master or the branch.
Merging Code in Team Explorer

To merge branch code back into the main code, perform the following steps:

1. In Team Explorer, right-click the solution or folder that is the branch, point to Branching and
Merging, and then click Merge.

2. In the Merge dialog box, select the branch to merge, select the main branch to merge the code into,
and then click OK.

3. Open the solution file for the main code to validate that the merge has been successful.

The release management in TFS is performed in the TFS web portal that can be accessed from Team
Explorer. This means that you can set a release definition that will contain the following information:
 The development objects that make up new releases.

 The environments in which the objects can be deployed.

 Any additional automation tasks that can be executed in each environment.

The settings are contained within a release definition that will be managed by a TFS build agent—a
service that is responsible for executing the release definition. To define an agent, go to the TFS web
portal, click Administer server, and then select the agent pool tab.

Build Management in Team Foundation Server


Build management is the process of compiling the
BI code and packaging it up as an executable that
can be deployed manually or automatically,
depending on how a build definition is configured.
You can manage builds by using the build icon in
Team Explorer as follows:

1. In Team Explorer, click Build.


2. In Builds, click New Build Definition.

When a screen opens up in Internet Explorer, you will be able to create and manage builds. A
definition template screen is displayed, where you
can select a template on which to base the build management. You can either select one of the
predefined templates, or create a new custom template using the Empty Process option. When you
select the Empty Process option, you will see the following tabs:

Tasks. This is used to define the solution that will be part of the build and the build steps in a specified
build. Additional properties can be configured, including the platform, configuration and Visual Studio
version where the solution is targeted. You can also specify the source of the working folder that stores
the code; for example, Team Foundation Version Control, GitHub or Subversion.

Variables. You can add variables to the build that can be used within the project. By default, this will
include system variables such as BuildPlatform and TeamProject. You can also add user defined variables,
which are specific to the solution created, to the build management.

Triggers. This determines how a build is to be executed. Multiple triggers can be set up and configured
to execute in one of two ways:

 Continuous Integration—this is done on each check-in of a project.

 Schedule—this means you can define a schedule for a build.


Options. This defines the build definition name and the agent pool to which the build is assigned, and
helps you to define timeouts for the build definition.
Retention. This sets out how long completed builds are retained, based on a time period in days.
This can include separate retention policies for the master branch and other branches in a TFS project.

History. This retains a history of the builds, and has a Diff command that can look at the differences in
each build created.

For a build to be queued, a build agent must be installed on the TFS server. When this is done and a build
definition is set, it can be managed in Team Explorer—this includes the ability to view the builds that are
queued or completed, edit the existing build definition, and carry out other tasks, including:

 Queue a build to start a build.

 Delete a build.

 Secure a build.

As an alternative, build automation can be provided by using MSBuild.exe and the scripts for the project.
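
As a minimal sketch of that approach, the following PowerShell commands invoke MSBuild against a solution. The MSBuild path depends on the Visual Studio edition that is installed, and the solution path is illustrative; note that some BI project types, such as SSIS, SSAS, and SSRS projects, may still require devenv.exe rather than MSBuild to build.

$msbuild = "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\MSBuild\15.0\Bin\MSBuild.exe"

# Build the Release configuration of the solution from the command line.
& $msbuild "D:\Source\AW_BI\AW_BI.sln" /t:Build /p:Configuration=Release /verbosity:minimal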

Demonstration: Configuring a Build Agent


To deploy a build, a build agent must be set up to enable a build to be created and deployed. This
demonstration shows you how to configure a build agent in Team Foundation Server.

Demonstration Steps
Open the Builds Console in Internet Explorer

1. Start Visual Studio 2017.


2. In Team Explorer, click the home icon.

3. Under Project, click the Builds icon.

4. In the Builds window, click New Build Definition. Internet Explorer starts.
Download the Build Agent

1. In Internet Explorer, in the page menu bar, click Builds.

2. On the Build Definitions page, click +Agent.

3. In the Get Agent window, on the Windows tab, click Download.

4. In the Internet Explorer notification bar, click the Save drop-down arrow, and then click Save as.

5. In the Save As dialog box, browse to D:\Demofiles\Mod04 and click Save.


6. On the Windows desktop, click File Explorer and browse to D:\Demofiles\Mod04.

7. Right-click the agent zip file and click Extract All.


8. In the Extract compressed (Zipped) Folders dialog box, extract all files to the
D:\Demofiles\Mod04\Starter\agent folder.

9. Close Internet Explorer.

Install the Build Agent on the MIA-SQL

1. In File Explorer, move to the D:\Demofiles\Mod04\Starter\agent folder, right-click config.cmd, and
then click Run as administrator.

2. In the User Account Control dialog box, click Yes.

3. At the Enter Server URL prompt, type the following text, and then press Enter:

http://mia-sql:8080/tfs

4. At the Enter authentication type prompt, press Enter to accept the default value of Integrated
authentication.

5. At the Enter agent pool prompt, press Enter to accept the default value.

6. At the Enter agent name prompt, accept the default value (MIA-SQL), and press Enter.

7. At the Enter work folder prompt, type D:\Demofiles\Mod04\agent\_work, and then press Enter.

8. At the Enter run agent as service? prompt, type Y, and then press Enter.
9. At the Enter the user account to use for the service prompt, type AdventureWorks\ServiceAcct,
and then press Enter.

10. At the Enter Password for user account AdventureWorks\ServiceAcct prompt, type Pa55w.rd,
and then press Enter.

11. Verify that the configuration completes without any errors.

Rollback Strategies
You should not overlook the BI operations team’s
ability to roll back the BI code that is deployed,
should an error occur with the release and build of
the solution. In this situation, the team can
respond by performing one of the following three
steps to mitigate a failed deployment:
 If time permits, fix the issue and perform
another release. This approach can work in
situations where there is a long time window
to perform a deployment. For example, some
organizations may not operate at weekends,
so releases are performed on Friday night. In
such circumstances, the release manager might say that there would be enough time to fix an issue
and redeploy the solution. The risk is that what seems to be a simple issue might be a symptom of a
bigger problem. This would mean that there is not enough time to fix and deploy—in which case, the
next rollback option would still have to be used.

 Undo changes and redeploy a previous release. You can adopt this pragmatic approach to return
the code base of a BI solution to a known state. The downside is that no new functionality will be
deployed as the deployment is effectively postponed. This option should be used in scenarios where
there is a small time window for a deployment.

 Continue with a deployment and apply a hotfix. Some deployments may contain errors that do
not affect the functionality of key components of the BI solution, but do affect a discrete part. In such
scenarios, the BI operations team might apply a temporary fix for the deployed solution in an
environment, with the intention of retrospectively fixing the issue in the code for the next release. This
approach works well for anticipated or known errors in a deployment.

If a rollback is necessary, it is important to inform the data director or manager, so that the appropriate
communications can be sent out within the business.

Question: What rollback strategies do you need to employ if a production deployment fails?

Lab: Deploying BI Solutions


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

You are a consultant working with the BI operations team to improve the operational management of
their current BI solution. The BI team have requested that you demonstrate how databases can be
managed between different environments using DACPACs and BACPACs. They also want a demonstration
on how builds can be managed using Team Foundation Server. Finally, they want to look at the different
examples of managing deployments.

Objectives
At the end of this lab, you will show:

 How to create a DACPAC from SQL Server Management Studio and Visual Studio.
 How to create a build in Team Foundation Server.

 How to deploy a BI solution.

Estimated Time: 45 minutes


Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Creating a Stand-alone DACPAC


Scenario
You have been asked to demonstrate how to create a DACPAC from within both SQL Server Management
Studio and Visual Studio. First, you will remove unnecessary users that would prevent a migration. Next,
you will create a DACPAC of the EIM_Demo database. You will then show the BI operations team how
DACPACs can be created from within a database project in Visual Studio.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment


2. Create a DACPAC from SQL Server Management Studio

3. Deploying a Data Tier Application

 Task 1: Prepare the Lab Environment


1. Read the lab and exercise scenarios.
2. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

3. Run Setup.cmd in the D:\Labfiles\Lab04\Starter folder as Administrator.

 Task 2: Create a DACPAC from SQL Server Management Studio


1. Open SQL Server Management Studio.

2. Connect to the MIA-SQL SQL Server instance.



3. Delete the SalesReaders user from the EIM_Demo database.

4. Create a DACPAC from the EIM_Demo database.

5. Validate the existence of the DACPAC.

 Task 3: Deploying a Data Tier Application


1. Connect to the MIA-SQL\SQL2 SQL Server Instance.

2. Deploy the EIM_Demo_Test DACPAC to the MIA-SQL\SQL2 instance.

Results: At the end of this exercise, you will have:

Created a DACPAC using SQL Server Management Studio.

Deployed a Data Tier Application.

Created a DACPAC using Visual Studio.


Validated the creation of a Data Tier Application.

Exercise 2: Managing Builds in Team Foundation Server


Scenario
The BI operations team have asked you to demonstrate how to create build definition in Team
Foundation Server. This involves installing a build agent on the TFS server, then defining a build definition
that can be used to create a build.

The main tasks for this exercise are as follows:


1. Creating a Build Agent

2. Creating a Build Definition

 Task 1: Creating a Build Agent


1. Start Visual Studio 2017 and create a new build definition.
2. Use the Builds Console in Internet Explorer to download the build agent and save to
D:\Labfiles\Lab04\Starter\Agent.
3. Install the build agent on MIA-SQL for the http://mia-sql:8080/tfs server using the
AdventureWorks\ServiceAcct user account to run the build agent as a Windows service.

 Task 2: Creating a Build Definition


1. Create a build definition named AW_BI Build definition that uses Visual Studio to build the
AW_BI.sln solution from Team Foundation Server. (Use a command prompt task that runs the
"devenv.exe" program with the /Build flag)

2. Deploy and validate the build of the AW_BI solution.

Results: At the end of this exercise, you will have:

Installed a build agent.


Created a build definition.

Validated a build definition.



Exercise 3: Exploring Deployment Methods


Scenario
The BI team have asked you to demonstrate the different deployment methods for deploying a BI
solution. You will show the team how manual deployments can be performed in SQL Server Management
Studio and Visual Studio. You will demonstrate an XMLA script deployment, and finally provide an
example of how to automate deployments of a BI component.

The main tasks for this exercise are as follows:

1. Manually Deploy a DACPAC

2. Deploying an Analysis Services Database using XMLA

3. Grant Remote Access to Integration Services

4. Automating the Deployment of a SSIS Package

5. Perform a Full Deployment from Visual Studio

 Task 1: Manually Deploy a DACPAC


1. Using SQL Server Management Studio, connect to the MIA-SQL SQL Server Instance and delete the
EIM_Demo database.

2. Deploy the EIM_Demo DACPAC to the MIA-SQL instance.

 Task 2: Deploying an Analysis Services Database using XMLA


1. Using File Explorer, move to the C:\Program Files (x86)\Microsoft SQL
Server\140\Tools\Binn\ManagementStudio folder and start the SQL Server Analysis Services
Deployment Wizard (Microsoft.AnalysisServices.Deployment.exe).
2. Using the wizard, create an XMLA deployment script named AW_SSASScript.xmla based on the
AW_SSAS.database file located in D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSAS
folder. The target server should be MIA-SQL with a database named AW_SSAS.

3. Connect to the MIA-SQL instance of Analysis Services and execute the AW_SSASScript.xmla script.

 Task 3: Grant Remote Access to Integration Services


 Use Component Services to grant the Student account remote access permissions for Integration
Services.

 Task 4: Automating the Deployment of a SSIS Package


1. Create a batch file named CopySSISPackage.cmd that uses the dtutil utility to copy the
EIM_Demo_DW_Load.dtsx file located in the
D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSIS folder to the MIA-SQL with the
name of EIM_Demo_DW_Load.
2. Execute the CopySSISPackage.cmd file.

3. Verify that the EIM_Demo_DW_Load package has been added to the MIA-SQL Integration Services
instance.

 Task 5: Perform a Full Deployment from Visual Studio


 Perform a full deployment of the AW_BI Solution from within Visual Studio.

Results: At the end of this lab, you will have:

Manually deployed a DACPAC that has been part of a Team Foundation Server Build.

Used Visual Studio to manually deploy a Reporting Services project.

Executed an XMLA script from within SQL Server Management Studio.


Automated the deployment of an SSIS package.

Question: What conclusions can be drawn from the various methods of deployment that are
available?

Question: Who handles deployments to production servers within your organization?



Module Review and Takeaways


In this module, you have learned the importance of deploying BI solutions to nonproduction
environments in the first instance to practice and define the process for deploying a BI solution. You have
explored the different methods to distribute databases to other servers and environments by using
DACPACs or BACPACs. You have learned the importance of source control, and how you can use Team
Foundation Server to create and manage builds. You have learned the different options for deploying a BI
solution either manually or through scripts.

Module 5
Logging and Monitoring in BI Operations
Contents:
Module Overview 5-1 
Lesson 1: The Need for Logging and Monitoring 5-2 

Lesson 2: Logging Options 5-6 

Lesson 3: Monitoring Options 5-14 


Lesson 4: Setting Up Alerts 5-26 

Lab: Monitoring BI Solutions 5-30 

Module Review and Takeaways 5-34 

Module Overview
The main aim of any operations team is to ensure the continued service of key applications that are used
within the business—more organizations are seeing a BI solution as a critical application for ensuring
success. Therefore, the BI operations team should implement a proactive approach to overseeing the
general health of the servers and services that are being used.

This will involve employing a number of technologies that can log the operations of a service to
proactively manage any potential problems that are identified. There will be times when the BI operations
team will have to be reactive, using monitoring tools to help identify the root cause of any potential
issues.

Objectives
After completing this module, you will be able to:
 Describe the need for logging and monitoring.

 Use various logging options.

 Use monitoring tools.


 Set up notifications.

Lesson 1
The Need for Logging and Monitoring
Logging and monitoring plays an important role in BI operations management. SQL Server® provides a
variety of tools that you can use to retrieve information about the configuration and performance of your
SQL Server components.

Even in environments where rigorous standards and processes are applied to managing the BI
infrastructure, there will be situations where the BI operations team will need to investigate and identify
the root cause of a reported issue.

Using the available logging and monitoring tools, in addition to your understanding of how the
environment is configured, can provide an overall picture of the processes that are affecting your system
and will help you to solve problems efficiently. Some of the tools can also be used to identify areas that
can be improved.

Lesson Objectives
After completing this lesson, you will be able to describe:

 An overview of logging and monitoring.


 The different types of logging.

 The different types of monitoring.

 The importance of baselining.

Logging and Monitoring Overview


Logging is an activity that records events that
occur within an operating system, or a service such
as SQL Server. Some logging activities run
automatically when the operating system or a
service starts; other logging activities must be
manually configured before they can run. In an
age where findings or opinions must be backed by
data, the range of available logging options can
provide the evidence that will either eliminate or
incriminate an individual’s theory as to why a SQL
Server is encountering errors, or performing at a
substandard level.
You can use monitoring tools to pinpoint specific areas of Windows® or SQL Server that may be having
problems. You can also gain more in-depth information than that provided by logging. SQL Server offers
many tools that can return information about how the hardware, such as CPU and memory, is being
utilized by queries. It can also focus on areas of the SQL Server itself, such as transaction management and
locking. Typically, the use of monitoring tools can help a team to dig deeper into issues that have been
raised in the logs.
The information that is returned by either the logging or monitoring tool should be acted upon. You
could resolve the issue by using a fix. However, if a fix cannot be found or implemented, you could set up
an alert that would notify the BI operations team should the symptoms of the issue resurface. This would
provide a proactive approach to managing the issue until a solution is found.

The BI operations team should, therefore, categorize the servers that they support, based on their
importance to the business. Mission critical servers that will cause business disruption could be
categorized as Tier 1 servers. You can set service level agreements to determine the amount of time that
Tier 1 servers can be offline, or the amount of time that is allowed for a fix to be put in place. Other
categories can then be defined for other types of servers. This will dictate the level of support, and the
amount of logging and monitoring that will be performed on each tier of servers.

If logging and monitoring is to be successful, it is important to establish a performance baseline—you can
then determine when an area of your SQL Server is not operating as expected.

Types of Logging
You can use a variety of tools to log events and
activities across many of the products in the
Windows and SQL Server stack. After a logging
option is set up, it will typically run constantly in
the background while the service is operational.
The intention is to collect information that logs
the activity that has been occurring against the
service; to log any errors or exceptions; or to log
the queries that are being executed against the
service. The logging options include:
 Windows logging. Windows Server®
provides event logs that enable you to review
a history of information regarding the system, its applications, and its security. You can use the event
log to see if the cause of an issue is related to Windows or to a specific application. Event logs begin
automatically when a Windows Server is started.

 SQL Server logging. SQL Server contains an error log that records the error so that you can
troubleshoot problems that are specific to SQL Server. The log file starts automatically at the same
time as the SQL Server instance. You can also configure how many log files are retained. In addition,
the SQL Server Agent that is typically used to automate the execution of BI tasks keeps a log of the
execution history.

 Integration Services logging. Integration Services provides a variety of options that enable you to
log standard events, or create custom logging events to monitor the progress of the execution of a
package. The types of logging available include:

o SSIS standard logging

o SSIS custom logging


o SSIS event handlers

 Analysis Services logging. Analysis Services provides extensive logging options that can record
many aspects of a server’s activities, including query performance and processing performance. The
logging options available to SSAS include:

o Query logging

o Error logging

o Exception logging

o Traces

o Flight recorder

 Reporting Services logging. Reporting Services provides logging capability that means you can view
any setup issues, and track the execution of reports through user or scheduled activities. The available
options for logging include:

o Setup log files

o Report execution logs

o Report server HTTP log

o Report server service trace log

 Data Quality Services logging. Data Quality Services provides three types of log files for
troubleshooting any issues that may arise with the Data Quality Server, Data Quality Client, and the
DQS cleansing component found in SQL Server Integration Services.

 Master Data Services logging. The Master Data Services Web.config file contains a tracing section
that means you can capture the activity that is occurring on Master Data Services and store it in a log
file.
Logging is useful for providing information about the general health of the SQL Server. There are so many
options for logging the activity of SQL Server BI components that it is important for the BI operations team
to be selective. Certain logging options, such as event logs and SQL Server error logs, are mandatory and
cannot be turned off. Other types of logging have to be configured before they can be used. A decision
must be made on which logging should be set up to provide the team with valuable information to help
them debug potential problems. Another consideration is that logging can consume additional resources.
Therefore, it is important to ensure that the logging setup does not have an excessive impact on server
resources.

Types of Monitoring
Monitoring tools are typically executed in
response to information that is found through
logging, but the logging solution does not provide
enough detail with which to solve the issue. You
can use monitoring to capture more granular data
about a particular area of Windows or a SQL
Server. Monitoring will consume resources that
can have an impact on the operation of a server.
As a result, monitoring is performed as a reactive
measure to dig deeper into an issue. However, you
can run many of the monitoring tools proactively
to capture more information, if the server on
which it runs can handle the load of the monitoring tool. The following monitoring tools are available:

 Windows monitoring. Windows Server provides Windows Performance Monitor to deliver real-time
information about hardware, services, and various SQL Server components, including SSIS, SSAS, and
SSRS that are installed on the server. Windows monitoring can also be set up, so that you can capture
information during a defined, targeted time period.

 SQL Server monitoring. SQL Server provides a variety of monitoring tools, such as Extended Events
and SQL Server Profiler, that can be used by different SQL Server components. In addition, you can
use the Transact-SQL language to query information that is stored about the SQL Server activity by

using dynamic management views (DMVs). Activity Monitor can also provide real-time information
about the connections that are running against an instance of a SQL Server. You can also use
Execution Plans to monitor the performance of Transact-SQL queries that are running against the
database.
 Integration Services monitoring. Integration Services installs a number of objects within Windows
Performance Monitor to provide real-time or collected performance data. You can also query the SSIS
Catalog to monitor the real-time execution of packages on an SSIS server.

 Analysis Services monitoring. Analysis Services also installs a number of objects within Windows
Performance Monitor. In addition, Extended Events can be used to monitor the performance of cubes.
You can also use SQL Server Profiler to monitor an Analysis Services instance.

 Reporting Services monitoring. Reporting Services installs a number of objects within Windows
Performance Monitor to monitor the activity of the Report Server Web Service, and the Report Server
Windows Service.

 Data Quality Services monitoring. Data Quality Services provides an activity monitor in the Data
Quality Services client to display information about the activity occurring on the client.

 Master Data Services monitoring. Trace logging is used for both monitoring and logging.

Importance of Baselining
A proactive approach to a logging and monitoring
strategy is to collect enough information to
understand the operational behavior of your SQL
Servers and the resources that are being
consumed during different periods of the business
life cycle. This is known as establishing a baseline.
Understanding the baseline operational
performance of a SQL Server will help you make
informed decisions about particular activities that
occur.

The logging and monitoring solution should take


into account the different levels of activity that
occur during the separate stages of a defined time period. For example, there may be an expectation that
the Analysis Services memory consumption will increase when cubes are being processed. Another
example could involve the increase in concurrency against a report server during the business month end
when month end reports are being generated and consumed by users.
In establishing a baseline of the core SQL Server components, the Windows operating system and the
hardware that supports the SQL Server, you will better understand how the servers perform. You can then
make an informed decision about whether or not the SQL Server is operating at an optimal level.

You will need to implement the logging and monitoring options over the periods of time that best reflect
the business process, and collect the information from the tools to establish clear baselines. For example,
the data may inform you that it is normal for the SQL Server memory to peak at 90 percent consumption
overnight, due to the activity of loading a data warehouse. However, memory consumption that occurs
during nondata warehouse load periods would be deemed unacceptable.

Question: What tools do you use to baseline the performance of your BI technologies?

Lesson 2
Logging Options
SQL Server provides many logging techniques to help you keep track of a wide range of areas that can
impact security, performance, and operations. You can also configure the tools to focus on specific events,
such as warnings and errors, or failed and successful events.

Different logging tools offer different benefits—some provide broad information about a technology,
whilst others focus on a specific aspect of a technology. Some of these tools provide unique benefits but
there are also limitations that the BI operations team should be aware of. An informed decision can then
be made about which tools to employ when logging the operations of a SQL Server BI estate.

Lesson Objectives
After completing this lesson, you will understand the options for:

 Windows logging.
 SQL Server logging.

 SQL Server Integration Services logging.

 SQL Server Analysis Services logging.


 SQL Server Reporting Services logging.

 Master Data Services and Data Quality Services logging.

Windows Logging
Windows event logs can provide useful
information regarding both the general health of
the SQL Server components and the Windows
operating system. It will log informational
messages, in addition to errors and warnings.
There is also a log to record security events that
occur on the system. The following three core log
areas would be of interest to the BI operations
team:

 System logs. Events that are logged by the


Windows system will appear in the log area as
informational, warning or error events.

 Application logs. SQL Server will log informational, warning, and error events in this log. Each event
that is logged contains a severity level—a severity level of 19 or above is typically logged with error
events and should be taken seriously by the BI operations team.

 Security logs. Security logs keep a record, or an audit, of security related events. These are typically
recorded as successful or failed events. The security log will not audit every single event. Usually,
additional configuration is required within Windows or SQL Server to make use of security logging.

When a SQL Server component is installed on a server, it will record general events to the application log.
The BI operations team can use the Event Viewer application to view, filter and manage the events that
are stored within the event logs. To access the Event Viewer, click the Windows Start button, type Event
Viewer, and then click the Event Viewer application.

In the Event Viewer application, the logs described above will appear in the Windows Logs node. You can
right-click a log and go to its properties to configure settings, such as the size and log storage location for
the event log, in addition to the retention period for logged data. The log size and log retention settings
are an important consideration for the BI operations team. You need to configure these settings so that
enough retention of the log data is available should you wish to troubleshoot a situation. There could be
a scenario where a server is so active that the logged data could fill up very quickly, causing old logged
data to be removed.

You can also interact with the Event Viewer Log by right-clicking a log and choosing one of the following
options:

 Clear the log

 Filter the log

 Save the log data

 Search the log

Filtering can help the team focus on the specific items that exist within a log—for example, you could
filter by an application, an event type, or a date range. Being able to save a log means you can clear the
log, should you wish to record events from a particular starting point.
When an issue is first reported, the BI operations team should use the event logs found in Windows to see
the general health of both the Windows system and SQL Server. Serious issues for SQL Server will be
recorded in the application log, so looking there can provide the first clues as to where an underlying
problem is occurring.

SQL Server Logging


SQL Server Error Logging
SQL Server error logs continuously log information
about the processes, scripts and commands that
have occurred within the SQL Server since its last
startup, or a restart of the SQL Server Database
Engine services. This provides information
regarding the general health of the SQL Server
that can help you identify any problems that may
be occurring.

The SQL Server error logs can either be viewed in
SQL Server Management Studio or opened up
within a text editor.

Viewing SQL Server error logs

To view the SQL Server error logs in SQL Server Management Studio, you should perform the following
steps:

1. Open SQL Server Management Studio, and connect to the database engine instance.

2. In Object Explorer, expand the instance, expand Management, and then expand SQL Server Logs.

3. By default, seven log files appear in Object Explorer.

4. To view a log, right-click a log file, and then click View SQL Server Log.

After a SQL Server log is opened, you can interact with the log file in a similar way to Windows event logs
by filtering the log data, exporting the data, and searching through the log data. If the team prefers, you
can work directly with the log data that, by default, is stored in the following location:

C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\LOG

Seven files appear, the first named ERRORLOG, and subsequent files named ERRORLOG.n, where n is
equal to the number of the log file that is displayed in SQL Server Management Studio. Being able to view
the log files in a text editor is useful in the event that you cannot access SQL Server.

Like the Windows event logs, the BI operations team should use the SQL Server error logs to get an overview
of the health of SQL Server and its components. The team can configure the number of error log files that
are retained by the SQL Server by right-clicking SQL Server Logs in Management Studio and clicking
Configure. This ensures that enough logging information is retained, because high volume environments
will cycle through the logs very quickly and, in extreme cases, may only hold error information for the
previous few hours. It might be necessary to configure this option so that information can be retained for
troubleshooting a SQL Server.
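If the team prefers to stay in a query window, the current and archived error logs can also be read with
Transact-SQL by using the sp_readerrorlog system stored procedure. The following is a minimal sketch; the
file number and search string shown are illustrative choices:

Reading the SQL Server error log with Transact-SQL

-- Read the current SQL Server error log (file 0) and return only the entries
-- that contain the word "error". The second parameter selects the log type:
-- 1 = SQL Server error log, 2 = SQL Server Agent error log.
EXEC sp_readerrorlog 0, 1, N'error';

-- Read the previous error log file (ERRORLOG.1) without any filter.
EXEC sp_readerrorlog 1, 1;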
SQL Server Setup Logs

Log files are also created during setup. If setup fails or succeeds but shows warnings or other messages,
you can examine the log files to troubleshoot the problem. This applies to all SQL Server components that
are set up. Each execution of setup creates log files with a new time-stamped log folder at C:\Program
Files\Microsoft SQL Server\140\Setup Bootstrap\Log. The time-stamped log folder name format is
YYYYMMDD_hhmmss. The log files can be opened in a text editor. It is typical to search for the word
“error” to go to the location within a file that contains any error messages.

SQL Server Integration Services Logging


To ensure that SSIS logging capability benefits the
BI operations team, you use the SSIS Catalog that
is available when SSIS packages are deployed to
SQL Server. The SSIS Catalog provides information
about projects, packages, parameters,
environments, and operational history.
Setting the logging levels within an SSIS
package
The required logging level can be set before a SSIS
package is executed. This is achieved by
configuring the logging settings in the Advanced
tab of a package that is being executed as shown
in the following steps:

1. In SQL Server Management Studio, connect to an Integration Services instance.

2. Navigate to a package in Object Explorer.

3. Right-click the package and select Execute.

4. Select the Advanced tab in the Execute Package dialog box.

5. Under Logging level, select the logging level, and then click OK to execute.

The available options when logging a SSIS package include:

 None. Logging is turned off during the execution of a package.

 Basic. The default value that logs all information except for diagnostic information.

 RuntimeLineage. Collects information that tracks the lineage of the data as it runs through the
different tasks in a package execution.
 Performance. Only performance statistics, along with warning and error events, are collected for
the package.

 Verbose. All events, including diagnostic events about the package execution, are collected.

SQL Server also enables you to create customized logging levels that collect only the statistics and events
that you want. When you run a package, you can select a customized logging level wherever you can
select a built-in logging level.

Creating a Custom Logging Level in SSIS

To create a custom logging level, you should perform the following steps:
1. In SQL Server Management Studio, right-click the SSISDB database and select Customized Logging
Level.

2. To create a new customized logging level, click Create, and then provide a name and description.

3. On the Statistics and Events tabs, select the statistics and events that you want to collect.
4. On the Events tab, optionally select Include Context for individual events, then click Save.

The new execution logging level can be selected the next time a package is executed.

The information that is collected from the package execution is stored in tables within the SSISDB Catalog.
Views that are stored in the database can be used to return information about the execution of packages,
event statistics, and the project settings and environments that are used. This means you can build queries
that return information about the execution and performance of a package.
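For example, the following queries illustrate how the catalog views might be used to review recent package
executions and to retrieve the error messages for a single execution. This is a minimal sketch against the
built-in SSISDB views; the 30-day window and the execution ID of 12345 are illustrative placeholders:

Querying the SSIS Catalog views

-- Recent package executions, most recent first.
-- Status 4 indicates a failed execution and status 7 indicates success.
SELECT e.execution_id, e.project_name, e.package_name, e.status, e.start_time, e.end_time
FROM SSISDB.catalog.executions AS e
WHERE e.start_time > DATEADD(DAY, -30, SYSDATETIME())
ORDER BY e.start_time DESC;

-- Error messages raised by a specific execution (replace 12345 with a real execution_id).
SELECT em.message_time, em.message
FROM SSISDB.catalog.event_messages AS em
WHERE em.operation_id = 12345
  AND em.event_name = N'OnError'
ORDER BY em.message_time;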

In SQL Server, the ssis_logreader database role has the ability to read the log data in the catalog. In
previous versions of SQL Server, this could only be done by the ssis_admin role. However, the permissions
associated with that role are considered too great for users who only need to read the log data in
the SSIS Catalog.

If package execution is started from the SQL Server Agent, you can use the information from the SQL
Server Agent history, along with the SSISDB views, to provide an evidenced-based picture of how the
packages are performing.

SQL Server Analysis Services Logging


SQL Server Analysis Services provides plenty of
logging capabilities with which you can view and
record a wide range of activities that take place on
the Analysis Server. The configuration of the
logging capability occurs in the Analysis Services
instance property on the General tab. In this
dialog box, you can define precisely what should
be logged, enabling the BI operations team to
have granular control of the information that is
collected and stored.

The most common logging options that are used include:

o Query Log. The query log records information about the queries used against the Analysis
Server. The query log does not capture the full MDX query that is sent to the server. Instead, it
captures a numeric list of the hierarchies and attributes used in each dimension, such as
01,00000010200000,100,00,100,12000. Each comma separates the level numbers between
dimensions. The server can use this list to find out which hierarchies were accessed and at what
level, so it can optimize its aggregates without having the details of the query. This information
can be stored in a SQL Server table and/or a file; a sample query against the table is shown after this list.

o The properties specific to the query log include:


 QueryLog \ QueryLogFileName. Specifies the name of the query log file. This property only
applies when a disk file is used for logging.
 QueryLog \ QueryLogSampling. Specifies the query log sampling rate. The default value
for this property is 10, meaning that one out of every 10 server queries is logged.
 QueryLog \ QueryLogFileSize. An advanced property that you should not change, except
under the guidance of Microsoft support.
 QueryLog \ QueryLogConnectionString. Specifies the connection to the query log
database.
 QueryLog \ QueryLogTableName. Specifies the name of the query log table. The default
value for this property is OlapQueryLog.
 QueryLog \ CreateQueryLogTable. A Boolean property that specifies whether to create the
query log table. The default value for this property is false, which indicates that the server will
not automatically create the log table and will not log query events.
o Error Logging. The events to log for errors are configured in the General page of the instance
properties of the Analysis Server. The settings in this area define the default values that will be
used by tools and applications that use Analysis Services. The first property to configure is the
LogDir property, which defines the location where the log file should be stored, followed by
the Log\File property. This defines the name of the file that, by default, is msmdsrv.log.
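If the query log is configured to write to a SQL Server table, the sampled query activity can be reviewed
with a simple query. The following is a minimal sketch that assumes the default table name of
OlapQueryLog and the standard columns that Analysis Services creates:

Querying the Analysis Services query log table

-- Sampled Analysis Services queries, slowest first. The Dataset column contains
-- the numeric list of hierarchies and attributes described earlier in this topic.
SELECT MSOLAP_Database, MSOLAP_User, Dataset, StartTime, Duration
FROM dbo.OlapQueryLog
ORDER BY Duration DESC;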

Additional Analysis Services logging capabilities include traces, exception logging, and the flight recorder.
However, it is recommended that these methods of logging should only be used after taking advice from
Microsoft.

SQL Server Reporting Services Logging


Reporting Services offers a variety of logging
options that provide the capability to review the
setup of Reporting Services, the interactions that
occur within Reporting Services, and the reports
that are executed.

Report Server HTTP Logging


Every HTTP request that is handled by the
Reporting Services web service can be logged. If
you want to view the level of interaction that is
occurring with Reporting Services Web Services,
this type of logging can provide information such
as the date, time, client and username of the
connection made to the web service.

Logging is turned off by default, but can be enabled by including the following code in the
ReportingServicesService.exe.config file:

Enabling HTTP logging in the ReportingServicesService.exe.config file


<RStrace>
<add name="FileName" value="ReportServerService_" />
<add name="FileSizeLimitMb" value="32" />
<add name="KeepFilesForDays" value="14" />
<add name="Prefix" value="tid, time" />
<add name="TraceListeners" value="debugwindow, file" />
<add name="TraceFileMode" value="unique" />
<add name="HttpTraceFileName" value="ReportServerService_HTTP_" />
<add name="HttpTraceSwitches" value="date,time,
clientip,username,serverip,serverport,host,method,uristem,uriquery,protocolstatus,bytesre
ceived,timetaken,protocolversion,useragent,cookiereceived,cookiesent,referrer" />
<add name="Components" value="all:3,http:4" />
</RStrace>

The configuration file is stored in the C:\Program Files\Microsoft SQL Server Reporting
Services\SSRS\ReportServer\bin folder and can be opened in a text editor.

Execution Logging

Information is recorded in the ReportServer database when a report is accessed and executed. Reporting
Services provides three views within the ReportServer database—you can view the times for retrieving,
processing, and rendering a report. It also provides information about the user who accessed the report
and when the report was accessed.

The ReportExecutionLog views can provide insightful information regarding the performance of reports,
and in particular, which aspect of the report retrieval is the slowest. The BI operations team can use these
views to identify problem reports through simple querying—the team can also identify which reports are
not being accessed.
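For example, the ExecutionLog3 view can be queried to find the slowest reports and to show where the time
is spent. The following is a minimal sketch that assumes the default ReportServer catalog database and the
standard ExecutionLog3 view; the seven-day window is an illustrative choice:

Querying the report execution log

-- The 20 slowest report executions over the last seven days, broken down by phase.
-- The time columns are reported in milliseconds.
SELECT TOP (20)
       ItemPath, UserName, TimeStart,
       TimeDataRetrieval, TimeProcessing, TimeRendering, Status
FROM ReportServer.dbo.ExecutionLog3
WHERE TimeStart > DATEADD(DAY, -7, GETDATE())
ORDER BY (TimeDataRetrieval + TimeProcessing + TimeRendering) DESC;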

Reporting Services Trace Files

The Reporting Services report server trace log file provides verbose information about the Report Server
web service, Report Manager, and the Reporting Services Windows service.

The log file must be enabled by adding the following section above the RStrace section in the
ReportingServicesService.exe.config file:

Enabling Reporting Services Tracing


<system.diagnostics>
<switches>
<add name="DefaultTraceSwitch" value="3" />
</switches>
</system.diagnostics>

A number of values can be used in the values property to determine the level of logging that would be
used, including:

0: Disables tracing.

1: Logs exceptions and restarts.

2: Logs exceptions, restarts, warnings.

3: Logs exceptions, restarts, warnings, status messages (default).

4: Logs verbose information.


This will add the additional logging to the same file that has been defined for the Reporting Services HTTP
log file.

Master Data Services and Data Quality Services Logging


Master Data Services
You have the ability to enable tracing of the
Master Data Services application. This means that
you can record the operations that have been
executing against the Master Data Service. The
web.config file that is used for Master Data
Services can be used to define the level of tracing
that will be performed by modifying the value for
the property switchvalue. The following list
provides the valid values that can be used:

 Off: tracing is disabled.

 Error: errors only.

 Warning: errors and warnings.

 Information: errors, warnings, and informational messages.


 Verbose: includes Information, and additional debugging trace information, including API requests.

 ActivityTracing: start and stop events only.

 All: includes verbose and ActivityTracing information.

Additional settings can be configured in the web.config file, including filename and path, which
determines the location and filename of the trace file. After configuration, you can read the file in a text
editor.

Data Quality Services

Data Quality Services includes log files for the Data Quality Server, the DQS client and the Data Cleansing
component that is used within SSIS.

The Data Quality Server log file is named DQServerLog.DQS_MAIN.log and, by default, is located in
C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Log. The Data Quality Client log
file is named DQClientLog.log and, by default, is located in %APPDATA%\SSDQS\Log. The Data Quality
Client log file contains similar information as the server log file, but from a client-side perspective. The
DQS Cleansing component log file is named DQSSSISLog.log and, by default, is located in
%APPDATA%\SSDQS\Log. All files can be opened in a text editor.

All of the log file sizes can be configured in the Advanced settings of Data Quality Services Client. They are
rolling files, with a new log file created when the existing log file exceeds the specified size limit.

Question: Which logging options do you use in your environments—and why?



Lesson 3
Monitoring Options
By adopting a correctly executed monitoring strategy, you can further investigate any issues that are
impacting a BI solution but have only been partially identified or resolved by the logging approach. With
monitoring, you can narrow the search to a specific problem area by using one of the many tools that are
provided by SQL Server and Windows.
Monitoring is typically used on a case-by-case basis to further assist in the identification of an issue. This
means you can troubleshoot an issue further with a view to better understanding the root cause of a
problem. By understanding the capabilities of the tools that are available, you will pick the right one for
the job.

Lesson Objectives
After completing this lesson, you will be able to monitor:
 The operating system.

 The SQL Server Database Engine.

 The SQL Server Agent.


 SQL Server BI components.

Monitoring the Operating System


It is important that you monitor the health of the
operating system where SQL Server is hosted, to
ensure that it does not prove a bottleneck for SQL
Server performance. A number of tools can be
used to monitor the Windows system, including:

 Task manager. Task manager will provide the
BI operations team with a snapshot of the
processes that are running on a server and the
performance of the server at the time that it is
accessed. Before you start a monitoring
activity, it is typical to check performance
within task manager to ensure that the server
is not under too much pressure.

 Windows Performance Monitor. You can use Windows Performance Monitor for a broad
monitoring solution that encompasses Microsoft Windows, SQL Server and the hardware that hosts
these components. Windows Performance Monitor can be used to provide real-time information
about hardware, services, and components on a physical server. It consists of objects that describe a
component or area of the operating system, such as the CPU or the memory object. When an
instance of SQL Server is installed, the installation will add objects to Windows Performance Monitor,
such as SQL Server: Databases and SQL Server: Transactions. Each object contains counters that will
measure a specific part of the object. For example, in the memory object, the available MBytes
counter can be used to monitor the amount of free space in the memory. In the SQL Server:
Databases object, the Percent Log Used counter can indicate how full a transaction log file is. Some
counters can also be broken down into instances. For example, the Percent Log Used counter can
select an instance of a database that you wish to monitor—such as the AdventureWorks database.

You can also create a data collector set that will store the real-time performance data from Windows
Performance Monitor in a file for later review.
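Many of the SQL Server counters that Performance Monitor exposes can also be read from inside SQL
Server through the sys.dm_os_performance_counters DMV, which can be convenient when only a query
window is available. The following is a minimal sketch that reads the Percent Log Used counter mentioned
above:

Reading SQL Server performance counters with Transact-SQL

-- Current value of the Percent Log Used counter for every database on the instance.
SELECT instance_name AS database_name,
       cntr_value AS percent_log_used
FROM sys.dm_os_performance_counters
WHERE counter_name = N'Percent Log Used'
ORDER BY cntr_value DESC;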

Monitoring the SQL Server Database Engine


The data warehouse provides the source of the
information that feeds other BI systems
downstream, such as Analysis Services or
Reporting Services. As it is hosted in the SQL
Server Database Engine, a number of real-time
and persisted monitoring options are available
within the SQL Server to check the health of the
database engine, including:

Activity Monitor
Activity Monitor provides real-time information
about the connections and processes against a
SQL Server. It can also provide information on the
most recent expensive queries. The data cannot be persisted and should only be used to confirm real-time
information about connections. You can also use Activity Monitor to kill connections against the instance.
To access Activity Monitor, right-click the instance in SQL Server Management Studio, and then click
Activity Monitor.

SQL Server Profiler

SQL Server Profiler is a graphical tool you use to monitor many aspects of SQL Server, including
transactions and stored procedures. You can use SQL Server Profiler to monitor these components in real
time—this is known as a trace. However, you typically store the results of a profiler trace in a file or table
for later review.
When creating a profiler trace, you must first set up the general properties of the trace, specifying a trace
name and whether or not to save the trace information to a trace file, or to a SQL table. Predefined
templates are available that contain some predefined events to monitor. To view all the available events,
select the blank template. A trace stop time can also be defined in the general properties.
Should you choose a blank template, you can use the Events tab to define the events that you want to
record. These events are organized into categories that represent a component of SQL Server. The trace
file can be stopped manually at any time. After the trace has been created, it can be replayed and
reviewed in SQL Server Profiler.

Typically, SQL Server Profiler is used for the database engine to monitor the queries that are executing
against the instance of the SQL Server. The following events could be profiled:

Category            Event                       Description

Stored Procedures   SP: Starting                Shows when a stored procedure has started executing.

Stored Procedures   SP: StmtStarting            Shows when a statement within a stored procedure is executing.

Stored Procedures   SP: CacheHit                Shows when a stored procedure is using an execution plan stored in memory.

TSQL                SQL: StmtStarting           Shows when a one-off Transact-SQL statement is executing.

Transactions        TM: Begin Tran Starting     Shows when a transaction has started.

Transactions        TM: Commit Tran Starting    Shows when a transaction is being committed.

Transactions        TM: Rollback Tran Starting  Shows when a transaction is rolling back.

Sessions            Existing Connection         Shows the properties of a connection.

Scans               Scan: Started               Shows when a table or index scan is being performed.

Locks               Lock: Timeout               Shows if there are locking timeouts of queries.

You can use the results of a trace file with other SQL Server tools, such as the Database Engine Tuning
Advisor. You can also integrate system monitor files within Profiler to correlate information between the
two tools if they have been running at the same time.

Transact-SQL Queries
You can use a range of Transact-SQL queries against system tables or DMVs to return useful information
to a BI operations team.

DMVs can be used to monitor the health and performance of a SQL Server. When SQL Server is started,
telemetry regarding its operations is stored within system tables. The BI operations team can query the
information stored by using DMVs that can be viewed in the View node of SQL Server Management
Studio—the DMV view name begins with sys.dm_*. If the SQL Server has been running for a long period
of time, the information that is captured in DMVs can be extremely valuable when you are trying to
establish the overall health of the SQL Server. When a SQL Server is restarted, the contents are purged.

The first question that should be asked of SQL Server before even querying the DMVs is:

Determining Server Uptime


DECLARE @StartTime DATETIME
DECLARE @CurrentTime DATETIME
DECLARE @ServerUptime INT

SET @StartTime = (SELECT create_date from sys.databases where name = 'tempdb')


SET @CurrentTime = GETDATE()
SET @ServerUptime = (SELECT DATEDIFF( Day, @StartTime, @CurrentTime))

SELECT 'Uptime in Days: ' + CAST(@ServerUptime AS varchar(20)) AS [Server Uptime]

The result of the query will return the number of days the server has been online, and it will tell the BI
operations team just how useful the information will be. The longer the uptime of the server, the more
useful the information is, because the server will have been through a range of business cycles, including
month-end and quarter-end periods.

Even if the server uptime is low, you could use the following system query to determine the amount of
tempdb data file and CPU cores that are configured in the instance, by using both system tables and
DMVs:

Determining the number of tempdb data files and CPU cores


SELECT COUNT(*) AS [Number of TempDb Data Files] FROM sys.master_files WHERE database_id
= 2 and type_desc = 'ROWS'

SELECT cpu_count AS [Number of CPUs] FROM sys.dm_os_sys_info

Using Transact-SQL queries to answer the questions that the BI operations team will be asking when there
is an issue provides the data-driven approach required to troubleshoot problems with a BI solution.
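As one further illustration of this data-driven approach, the following hedged sketch lists the most
expensive cached statements by total CPU time since the last restart, which can help identify the data
warehouse queries that are consuming the most resources:

Determining the most expensive cached queries

-- Top 10 cached statements by total CPU time (total_worker_time is in microseconds).
SELECT TOP (10)
       qs.total_worker_time / 1000 AS total_cpu_ms,
       qs.execution_count,
       SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;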

Monitoring the SQL Server Agent


Automating the execution of BI tasks is typically
performed by the SQL Server Agent.

The SQL Server Agent is a separate component of
SQL Server that is used to automate jobs, and send
alerts and notifications to a defined user or
application. When a job is executed within the SQL
Server Agent, its history is maintained so that an
administrator can review the outcome. This can
prove useful in scenarios where a BI job has failed
and requires attention.

Viewing SQL Server Agent History

To view the history of a job, you should perform the following steps:

1. Open SQL Server Management Studio, and connect to the database engine instance.

2. In Object Explorer, expand the SQL Server Agent, and then expand Jobs.

3. To view history, right-click a job, and then click View History.


You can interact with the history by filtering the history data, exporting the data, and searching through
the history data. You can also configure the retention of the history data by configuring the properties of
the SQL Server Agent on the history page.

It is not uncommon for SQL Server integration tasks to be scheduled using the SQL Server Agent. In
addition, Reporting Services uses the SQL Server Agent to execute scheduled snapshots, caches and
subscriptions. Therefore, understanding how to view the history of a SQL Server Agent job is important
when you are solving execution issues that are reported to the BI operations team.
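The job history can also be retrieved with Transact-SQL from the msdb system tables, which is useful when
the team wants to report on overnight job outcomes. The following is a minimal sketch using the standard
msdb tables:

Querying SQL Server Agent job history

-- Most recent outcomes for all SQL Server Agent jobs.
-- Step 0 is the row that records the overall outcome of a job.
SELECT j.name AS job_name,
       h.run_date, h.run_time,
       CASE h.run_status
            WHEN 0 THEN 'Failed'
            WHEN 1 THEN 'Succeeded'
            WHEN 2 THEN 'Retry'
            WHEN 3 THEN 'Canceled'
       END AS outcome,
       h.message
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h ON h.job_id = j.job_id
WHERE h.step_id = 0
ORDER BY h.run_date DESC, h.run_time DESC;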

Monitoring SQL Server BI Components


Windows Performance Monitor

Windows Performance Monitor contains many
SQL Server and Windows objects that can be used
to monitor the database engine, Integration
Services, Analysis Services, and Reporting Services.
Common counters that can be used to monitor a
BI environment are listed below, shown as Object \ Counter, with the instance to monitor in parentheses:

System \ Processor queue length (instance: N/A). Indicates how many threads are waiting for execution
against the processor. If this counter is consistently higher than around 5 when processor utilization
approaches 100 percent, this is a good indication that there is more work (active threads) available (ready
for execution) than the machine's processors are able to handle.

System \ Context switches/sec (instance: N/A). Measures how frequently the processor has to switch from
user to kernel mode to handle a request from a thread running in user mode. The heavier the workload
running on your machine, the higher this counter will generally be but, in the long term, the value of this
counter should remain fairly constant. However, if this counter suddenly starts increasing, it may be an
indication of a malfunctioning device, especially if the Processor\Interrupts/sec\(_Total) counter on your
machine shows a similar, unexplained increase.

Process \ % processor time (instances: sqlservr, msmdsrv). Definitely should be used if Processor\% Processor
Time\(_Total) is peaking at 100 percent, to assess the effect of the SQL Server process on the processor.

Process \ Working set (instances: sqlservr, msmdsrv). If the Memory\Available bytes counter is decreasing, it
can be run to indicate if the process is consuming increasingly large amounts of RAM.
Process(instance)\Working Set measures the size of the working set for each process. This indicates the
number of allocated pages the process can address without generating a page fault.

Processor \ % processor time (instances: _Total and individual cores). Measures the total utilization of your
processor by all running processes. If you have a multiprocessor system, you should know that only an
average is provided.

Processor \ % privileged time (instance: _Total). Used to see how the OS is handling basic IO requests. If
kernel mode utilization is high, it is likely that your machine is underpowered—it is too busy handling basic
OS housekeeping functions to effectively run other applications.

Processor \ % user time (instance: _Total). To see how an application is interacting from a processor
perspective, a high percentage utilization determines that the server is dealing with too many apps and
might require an increase in hardware or scaling out.

Processor \ Interrupts/sec (instance: _Total). The average rate, in incidents per second, at which the
processor received and serviced hardware interrupts. Should be consistent over time but a sudden
unexplained increase could indicate a device malfunction that can be confirmed using the System\Context
Switches/sec counter.

Memory \ Pages/sec (instance: N/A). Indicates the rate at which pages are read from or written to disk to
resolve hard page faults. This counter is a primary indicator of the kinds of faults that cause system-wide
delays, and is the primary counter to watch for any indication of possible insufficient RAM to meet your
server's needs. A good idea here is to configure a perfmon alert that triggers when the number of pages
per second exceeds 50 per paging disk on your system. You may also want to see the configuration of the
page file on the server.

Memory \ Available Megabytes (instance: N/A). This is the amount of physical memory, in bytes, available to
processes running on the computer. If this counter is greater than 10 percent of the actual RAM in your
machine, then you probably have more than enough RAM. Monitor it regularly to see if any downward
trend develops, and set an alert to trigger if it drops below 2 percent of the installed RAM.

Physical Disk \ Disk transfers/sec (instance: each physical disk). If it goes above 10 disk I/Os per second, you
have a poor response time for your disk.

Physical Disk \ Idle time (instance: each physical disk). If the number of disk transfers per second is above 25
disk I/Os per second, you should use this counter. It measures the percentage of time that your hard disk is
idle during the measurement interval—if you see this counter fall below 20 percent, it is likely that you have
read/write requests queuing up for your disk, which is unable to service these requests in a timely fashion.

Physical Disk \ Disk queue length (instance: the SQL Server and Analysis Services disks). A value that is
consistently less than 2 means that the disk system is handling the I/O requests against the physical disk.

Network \ Current bandwidth (instance: each network card). This is an estimate of the current bandwidth of
the network interface in bits per second (bps).

MSAS 2016: Memory \ Memory limit high KB (instance: N/A). Shows (as a percentage) the high memory limit
configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\Config\msmdsrv.ini.

MSAS 2016: Memory \ Memory limit low KB (instance: N/A). Shows (as a percentage) the low memory limit
configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\Config\msmdsrv.ini.

MSAS 2016: Memory \ Memory usage KB (instance: N/A). Displays the memory usage of the server process.

MSAS 2016: Memory \ File store KB (instance: N/A). Displays the amount of memory that is reserved for the
cache. Note that, if the total memory limit in the msmdsrv.ini is set to 0, no memory is reserved for the cache.

MSAS 2016: Storage Engine Query \ Queries from cache direct/sec (instance: N/A). Displays the rate of
queries answered directly from the cache.

MSAS 2016: Storage Engine Query \ Queries from cache filtered/sec (instance: N/A). Displays the rate of
queries answered by filtering an existing cache entry.

MSAS 2016: Storage Engine Query \ Queries from file/sec (instance: N/A). Displays the rate of queries
answered from files.

MSAS 2016: Storage Engine Query \ Average time/query (instance: N/A). Displays the average time of a query.

MSAS 2016: Connection \ Current connections (instance: N/A). Displays the number of connections against
the SSAS instance.

MSAS 2016: Connection \ Requests/sec (instance: N/A). Displays the rate of query requests per second.

MSAS 2016: Locks \ Current lock waits (instance: N/A). Displays the number of connections waiting on a lock.

MSAS 2016: Threads \ Query pool job queue length (instance: N/A). The number of queries in the job queue.

MSAS 2016: Proc Aggregations \ Temp file bytes written/sec (instance: N/A). Shows the number of bytes of
data processed in a temporary file.

SQL Server: SSIS pipeline \ Buffer memory (instance: N/A). The amount of memory that is used by data flow
tasks that are executing at the time of monitoring.

SQL Server: SSIS pipeline \ Buffers spooled (instance: N/A). The amount of memory that is written to disk.

SQL Server: SSIS pipeline \ Rows read (instance: N/A). The total number of rows that are used as inputs into
SSIS data flows.

SQL Server: SSIS pipeline \ Rows written (instance: N/A). The total number of rows that are outputs of the
SSIS data flows.

SQL Server: SSIS service \ SSIS package instances (instance: N/A). The total number of SSIS packages executing.

Report Server Service \ Memory pressure state (instance: N/A). Reports the current pressure that the memory
is under, with the following settings returned: 1: no pressure; 2: low pressure; 3: medium pressure; 4: high
pressure; 5: exceeded pressure.

Report Server Service \ Bytes received/sec (instance: N/A). The amount of data per second received.

Report Server Service \ Bytes sent/sec (instance: N/A). The amount of data per second sent.

SQL Server: Buffer Manager \ Buffer cache hit ratio (instance: the SQL Server instance). The percentage of
data that was read from memory, rather than the disk.

SQL Server: Buffer Manager \ Page life expectancy (instance: the SQL Server instance). The average length of
time that data will remain in memory before being ejected. A low value can indicate excessive memory
pressure.

To use Performance Monitor, you should perform the following steps:

1. Click Start, type Performance, and then click Performance Monitor.

2. In the Performance Monitor console, on the left pane, ensure that Performance is selected.

3. Under Monitoring Tools, click Performance Monitor.

4. On the right pane, click the Add button.



5. In the Add Counters dialog box, in the Select counters from computer list, ensure that <Local
computer> is selected.
6. Scroll through the counters list, expand the Memory node, click the Available MBytes counter, click
Add, and then click OK.

7. On the right pane, observe the graph.

Profiler

Profiler can be used to monitor either the database engine or Analysis Services. It can be used to perform
real-time monitoring of the component in question or, alternatively, you can save the real-time
monitoring to a trace file for later review. Running SQL Server Profiler traces is extremely resource intensive and
should only be used when the server in question can handle the load. However, the data that is returned
could help you to pinpoint the cause of an issue on a server, if the correct activities to monitor are chosen.
To run SQL Server Profiler, click Start on the Windows desktop and type SQL Server Profiler.

When you are profiling Analysis Services, the monitoring of the following events can help you to
troubleshoot query and processing performance:

Category           Event                           Description

Server             Errors                          Shows when error messages are returned by the Analysis Server.

Queries Events     Query Begin                     Shows when a query begins.

Query Processing   Execute MDX Script Begin        Shows when an MDX query starts to execute.

Query Processing   Query Cube Begin                Shows the individual requests for data that can be used within the Usage Based Optimization.

Query Processing   Get Data From Aggregation       Shows the queries that are returning results from the aggregations stored in the cubes.

Query Processing   Query Subcube Verbose           Shows the individual requests for data in a more readable format.

Query Processing   DAX Query Plan                  Shows the query plan information for queries in tabular data models.

Query Processing   Direct Query Begin              Shows when direct query mode is being used in tabular data models.

Query Processing   VertiPaq SE Query Cache Match   Shows the individual requests for data against a tabular data model.

Creating a SQL Server Profiler Template


You can use SQL Server Profiler to create a profiler template for reuse. To create a template, you should
perform the following steps:

1. On the Windows desktop, click Start, type Profiler, and then click SQL Server Profiler.

2. On the menu bar, click File, point to Templates, and then click New Template.

3. In the Trace Template Properties dialog box, next to server type, click the drop-down, and select
SQL Server 2016 Analysis Services.
4. Next to name, type a name for the template.

5. Click the Events Selection tab, and configure the events to include in the template.

6. Click the Save button.

SSIS Catalog

From Microsoft CodePlex, you can download SSIS reporting packs that contain prebuilt reports that query
the SSIS Catalog to provide operational information about package execution. The SSIS reporting pack
can be found at https://ssisreportingpack.codeplex.com/.

Reporting Services

Reporting Services provides execution logs that are covered in the next module.

Demonstration: Using Activity Monitor


Activity Monitor provides real-time information regarding the connections and processes against a SQL
Server. In this demonstration, you will explore the options available.

Demonstration Steps
1. Start Microsoft SQL Server Management Studio.
2. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.
3. To open the Activity Monitor, in Object Explorer, right-click MIA-SQL, and then click Activity
Monitor.

4. On the MIA-SQL - Activity Monitor tab, click Processes.

5. To open the Resource Waits section, click Resource Waits.

6. To open the Data File I/O section, click Data File I/O.

7. To open the Recent Expensive Queries section, click Recent Expensive Queries.

8. To change the refresh interval, right-click anywhere in the Overview section, point to Refresh interval,
and then click 1 second.

9. Close SQL Server Management Studio without saving any changes.

Demonstration: Creating a Data Collector


Microsoft Windows Server provides a data collector that you can use to log information about the
components that are installed on a Windows system with Performance Monitor. In this demonstration,
you will see how to create a data collector for general SQL Server monitoring.

Demonstration Steps
1. Click Start, type Performance, and then click Performance Monitor.
2. To view the list of data collector sets, in the Performance Monitor window, on the left pane, click
Data Collector Sets.

3. To create a new data collector set, expand the Data Collector Sets node, right-click User Defined,
point to New, and then click Data Collector Set.
4. In the Create New Data Collector Set wizard, on the How would you like to create this new data
collector set? page, in the Name box, type a name such as SQL Monitoring.

5. Select the Create manually (Advanced) option and click Next.

6. On the What type of data do you want to include? page, select the Performance counter check
box, and then click Next.

7. On the Which performance counters would you like to log? page, click Add.
8. In the dialog box, in the Available counters section, expand the Processor node, scroll down, click
%Processor Time, and then click Add.

9. Scroll up and expand the PhysicalDisk node, scroll down and click Avg. Disk Queue Length, and
then click Add.

10. Scroll up and expand the Memory node, scroll down and click Available MBytes, and then click
Add.
11. Scroll down and expand the SQLServer:Databases node, click Active Transactions, and then click
Add.

12. Click OK.


13. On the Which performance counters would you like to log? page, click Next.

14. On the Where would you like the data to be saved? page, click Next.

15. On the Create the data collector set? page, ensure that Save and close is selected, and then click
Finish.

16. In the Performance Monitor window, on the right pane, right-click SQL Monitoring, and then click
Start.

Demonstration: Using SQL Server Profiler


SQL Server Profiler can provide a rich set of information that specifically targets problems and gives in-
depth information. In this demonstration, you will see how to use SQL Server Profiler.

Demonstration Steps
1. On the taskbar, click the File Explorer shortcut.
2. View the contents of the D:\Demofiles\Mod05 folder.

3. Right-click Setup.cmd, and then click Run as administrator.

4. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
5. Click Start, type SQL Server Profiler, and then click SQL Server Profiler 17.

6. In the SQL Server Profiler window, on the File menu, click New Trace.
7. In the Connect to Server dialog box, in the server type, select Database Engine. In the Server name
list, ensure that MIA-SQL is selected, and then click Connect.

8. In the Trace Properties dialog box, in the Trace name box, select the text, and type
QueryMonitoring.trc.

9. In the Use the template list, select Blank.

10. In the Trace Properties dialog box, select the Save to file check box.

11. In the Save As dialog box, browse to D:\Demofiles\Mod05.

12. In File name, type QueryMonitoring.trc, and then click Save.

13. In the Confirm Save As dialog box, click Yes.

14. On the Events Selection tab, expand the TSQL node, scroll down, and select the SQL:BatchStarting,
SQL:BatchCompleted, SQL:StmtStarting, and the SQL:StmtCompleted check boxes.

15. In the DatabaseID column, clear the SQL:BatchStarting, SQL:BatchCompleted, SQL:StmtStarting,


and the SQL:StmtCompleted check boxes, and then click Column Filters.

16. In the Edit Filter dialog box, in the list, click DatabaseName.

17. Expand the Not like node, type Master, and then click OK.

18. In the Trace Properties dialog box, click Organize Columns.

19. In the Organize Columns dialog box, in the list, click DatabaseName, and then click Up until
DatabaseName appears under the Groups node, and then click OK.

20. To run the trace, in the Trace Properties dialog box, click Run.

21. In the QueryMonitoring.trc (MIA-SQL) window, expand the (1) node.

22. Start Microsoft SQL Server Management Studio.

23. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.

24. In Object Explorer, expand the Databases node, expand the AdventureWorks node, and then
expand the Tables node.

25. Right-click Person.Address, and then click Select Top 1000 Rows.

26. Right-click HumanResources.Employee, and then click Select Top 1000 Rows.
27. Right-click HumanResources.JobCandidate, and then click Select Top 1000 Rows.

28. Right-click Person.Person, and click Select Top 1000 Rows.

29. Right-click Production.BillOfMaterials, and then click Edit Top 200 Rows.

30. Additional tables may be selected to provide more information to SQL Server Profiler.

31. Close SQL Server Management Studio.

32. Return to SQL Server Profiler, expand the AdventureWorks database name, and review the traces for
the work performed.

33. In the toolbar, click Stop Selected Trace, and then close SQL Server Profiler.

Question: Are there any barriers to you running monitoring tools on production servers?

Lesson 4
Setting Up Alerts
You can use SQL Server to create and configure alerts. Alerts enable you to define a condition, and when
the condition is met, an action will be performed. This can either involve alerting members of the BI
operations team about errors with a particular component, or invoking a job that could remediate the
error that is occurring. Using alerts is a prudent way to provide a first response to an issue—the
operations team should create alerts that will help them to respond immediately to any issues that arise.

Lesson Objectives
After completing this lesson, you will have created:

 SQL Server Agent Operators.

 SQL Server Agent alerts.

SQL Server Agent Operators


Many BI activities will take place overnight. During
this time, it is typical for SSIS to be used to load
data into a data warehouse. Analysis Services will
process the Multidimensional or Tabular data
models, and Reporting Services will perform
scheduled report delivery. The BI operations team
can be notified of any issues that occur in this
process, if they are defined within the SQL Server
Agent—and the overnight jobs are initiated by
SQL Server Agent jobs.
In SQL Server Agent, an operator is defined as an object that acts as an alias for one or more e-mail or
pager addresses. If you want to receive notifications on SQL Server jobs and alerts, an operator should be
defined that will be notified when a job fails, or when an alert threshold triggers a notification.

To define an operator, you should use the following steps:

1. Start SQL Server Management Studio.


2. In the Connect to Server dialog box, ensure that the Server name is set to MIA-SQL, and then click
Connect.

3. To create an operator, in Object Explorer, expand the SQL Server Agent node, right-click Operators,
and then click New Operator.

4. In the New Operator window, in the Name box, type BIOperations.

5. In the E-mail name box, type [email protected].

6. Under the Pager on the duty schedule, select the Monday, Tuesday, Wednesday, Thursday, and
Friday check boxes.

7. To close the New Operator window, click OK.
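The same operator can also be created with Transact-SQL by using the msdb.dbo.sp_add_operator system
stored procedure. The following is a minimal sketch; the operator name and e-mail address are illustrative
placeholders:

Creating an operator with Transact-SQL

-- Create an operator that SQL Server Agent jobs and alerts can notify by e-mail.
EXEC msdb.dbo.sp_add_operator
    @name = N'BIOperations',
    @enabled = 1,
    @email_address = N'BIOperations@example.com';   -- replace with the team's address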



Demonstration: Creating an Operator


In this demonstration, you will see how to create an operator within SQL Server Agent.

Demonstration Steps
1. Start SQL Server Management Studio.

2. In the Connect to Server dialog box, ensure that the Server name is set to MIA-SQL, and then click
Connect.

3. To create an operator, in Object Explorer, expand the SQL Server Agent node, right-click Operators,
and then click New Operator.

4. In the New Operator window, in the Name box, type BIOperations.

5. In the E-mail name box, type [email protected].


6. Under the Pager on duty schedule, select the Monday, Tuesday, Wednesday, Thursday, and
Friday check boxes.

7. Click OK.
8. Close SQL Server Management Studio without saving any changes.

SQL Server Agent Alerts


SQL Server Agent alerts help you respond to errors
or performance condition thresholds, by either
sending an alert, executing a job, or performing
both at the same time.

For example, during a data warehouse load, a
transaction log file may become 90 percent full. A
performance condition alert can be set up to
automatically invoke a job when the transaction
log file hits 90 percent. This alert will back up the
transaction log and send an email message to the
BI operations team. As a result, potential errors
can be mitigated by setting an alert for a given
threshold.

SQL Server alerts can be set up to respond to three types of scenarios, but typically two scenarios are
used. These can include:

 Performance conditions. Performance objects are added to Performance Monitor in Windows
Server, and these can help you to monitor the real-time activities of SQL Server objects when SQL
Server is installed. SQL Server Agent utilizes these objects in alerts to notify an operator or start a job
at a predefined threshold of a specific object. You can control the alert to fire above or below a
specific object threshold.

 SQL Server events. For SQL Server events, alerts can be set up to alert an operator or start a job in
response to SQL Server error numbers, or error severity levels.

 Windows Management Instrumentation (WMI) alerts. WMI is a technology that is used by
Windows operating systems to manage and monitor the Windows system. SQL Server uses WMI to
monitor a managed resource—this can be any object, including computer hardware, computer
software, and a service that can be managed. SQL Server alerts can be set up, based on a WMI class
library and the WMI scripting language that can query managed resources in a language that is very
similar to Transact-SQL, known as WMI Query Language (WQL).

SQL Server Agent alerts provide a useful feature that enables the BI operations team to be proactive in the
management of the BI estate. You should consider which servers to set up alerts for, based on the tier of
server they are assigned from an operational point of view.
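Alerts can also be scripted. The following is a minimal, hedged sketch that creates a performance condition
alert similar to the transaction log example above and notifies an operator by e-mail. The database name,
threshold, and operator name are illustrative, the operator is assumed to already exist, and the performance
condition string follows the ObjectName|CounterName|InstanceName|Comparison|Value format—the
exact object name prefix can vary with the SQL Server instance:

Creating a performance condition alert with Transact-SQL

-- Fire an alert when the EIM_Demo transaction log is more than 50 percent full.
EXEC msdb.dbo.sp_add_alert
    @name = N'Log File size for EIM_Demo',
    @performance_condition = N'SQLServer:Databases|Percent Log Used|EIM_Demo|>|50';

-- Notify the BIOperations operator by e-mail when the alert fires.
EXEC msdb.dbo.sp_add_notification
    @alert_name = N'Log File size for EIM_Demo',
    @operator_name = N'BIOperations',
    @notification_method = 1;   -- 1 = e-mail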

Demonstration: Creating an Alert


In this demonstration, you will see how to create an alert within SQL Server Agent.

Demonstration Steps
1. Start SQL Server Management Studio.
2. In the Connect to Server dialog box, ensure that the server name is set to MIA-SQL, and then click
Connect.
3. To create a SQL Server alert, in Object Explorer, expand the SQL Server Agent node, right-click
Alerts, and then click New Alert.

4. In the New Alert window, in the Name box, type Log File size for EIM_Demo.

5. In the Type list, select SQL Server performance condition alert.

6. In the Object list, select Databases.

7. In the Counter list, select Percent Log Used.

8. In the Instance list, select EIM_Demo.


9. In the Alert if counter list, select rises above, and then in the Value box, type 50.

10. To define a response, in the Select a page pane, click Response, and then select the Notify
operators check box.
11. In the Operator list, select the E-mail check box.

12. To execute a job, select the Execute job check box, and then click New Job.

13. In the New Job window, in the Name box, type Backup EIM Log, and then in the Select a page pane,
click Steps, and then click New.

14. In the New Job Step window, in the Command box, type:

BACKUP LOG EIM_Demo TO DISK = 'D:\Demofiles\Mod05\EIM_DemoLOG.TRN'

15. In the Step name box, type Backup.


16. In the Select a page pane, click Advanced, and then in the Retry attempts box, click the up arrow
once to set the number of attempts to 1. In the Retry interval (minutes) box, click the up arrow
once to set the time to 1, and then click OK.
17. In the New Job Step window, in the Select a page pane, click Notifications, and then select the E-
mail check box.

18. In the first E-mail list, select BIOperations; in the second E-mail list, ensure that When the job fails is
selected.
19. On the Notifications page, select the Write to the Windows Application event log check box; in
the list, select When the job fails, and then click OK.

20. In the New Alert window, in the Select a page pane, click Options, select the E-mail check box, and
in the Additional notification message to send box, type generated by an alert from the SQL
Server, and then click OK.

21. Close SQL Server Management Studio.


Question: Which BI process do you schedule with the SQL Server Agent? Do you set up
alerts for these processes?

Lab: Monitoring BI Solutions


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

Adventure Works employees are increasingly frustrated by the time it takes for business reports to
become available on a daily basis. The existing managed BI infrastructure—including data warehouses,
enterprise data models, and reports and dashboards—are valued sources of decision-making information.
However, users are increasingly finding it takes too long for the data to be processed in the overnight
load, resulting in reports not arriving to business users until the early afternoon.

You have been asked to support the BI operations team in devising a logging and monitoring solution
that will help Adventure Works understand what could be causing the issues that they are experiencing.
To that end, you will set up the logging and monitoring process in preparation for execution when
troubleshooting the issues in the environment.

Objectives
At the end of this lab, you will have set up:
 General logging and monitoring.

 Targeted logging and monitoring.

Estimated Time: 60 minutes


Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Setting Up General Logging and Monitoring


Scenario
You have advised the BI operations team that, to deal with the issues that they are currently experiencing,
they should set up a general logging and monitoring process that captures general information about the
hardware, the operating system, and the SQL Server BI components.

This will involve configuring the Windows event logs to store up to 50 MB of information and the SQL
Server error logs to capture information over 14 log files. You will need to configure a data collector that
captures information about all of the components, and a custom SSIS log that can be used to log and
monitor errors and warnings regarding the overnight SSIS load. You will also create a SQL Server Agent
job that captures the execution history of the EIM_Demo_DW_Load package when it executes.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Configuring Windows Event Logs

3. Configuring SQL Server Error Logs


4. Configuring a Data Collector

5. Creating an SSIS Custom Log

6. Creating a SQL Server Agent Job



 Task 1: Prepare the Lab Environment


1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab05\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

 Task 2: Configuring Windows Event Logs


1. Open Windows Event Viewer.

2. Configure the Application Log to store 50 MB of data.

3. Configure the System Log to store 50 MB of data.

4. Configure the Security Log to store 50 MB of data.

5. All logs must be archived when full.

 Task 3: Configuring SQL Server Error Logs


1. Connect to the MIA-SQL Server Database Engine instance.

2. Configure the SQL Server error logs to use 14 files.

 Task 4: Configuring a Data Collector


 Create and start data collector named SQL BI Monitoring that contains the following counters:

Object                            Counter

Processor                         % privileged time

Processor                         % user time

Memory                            Pages/sec

Memory                            Available MBytes

SQL Server: Buffer Manager        Buffer cache hit ratio

SQL Server: Buffer Manager        Page life expectancy

LogicalDisk                       Average disk queue length

Network Interface                 Current bandwidth

SQL Server: SSIS pipeline 14.0    Buffer memory


 Task 5: Creating an SSIS Custom Log


 Create a Custom Log named Errors and Warnings that captures executable and component statistics
for warning and error events.

 Task 6: Creating a SQL Server Agent Job


 Create a SQL Server Agent job named EIM_Demo BI Load that contains a SQL Server Integration
Packages step that executes the EIM_Demo_DW_Load package located in the MIA-SQL instance.
Verify that the job starts successfully (don't wait for it to complete).

Results: At the end of this lab, you will have configured:

Windows event logs.

SQL Server error logs.

A data collector.
An SSIS custom log.

A SQL Server Agent job.

Exercise 2: Setting Up Targeted Logging and Monitoring


Scenario
You have advised the BI operations team to be proactive by preparing for monitoring systems that can be
run immediately when they are required. To that end, you will create a SQL Server Profiler template for
both SQL Server and Analysis Services to monitor the queries that occur against the databases and the
cubes.

The main tasks for this exercise are as follows:

1. Creating a SQL Server Profiler Template for the Database Engine


2. Creating a SQL Server Profiler Template for Analysis Services

 Task 1: Creating a SQL Server Profiler Template for the Database Engine
 Create a SQL Server Profiler template that contains events to monitor queries that occur against the
SQL Server instance. Accept the default columns when selecting the events.

 Task 2: Creating a SQL Server Profiler Template for Analysis Services


 Create an Analysis Services Profiler template that contains events to monitor multidimensional cubes
and server errors. Accept the default columns when selecting the events.

Results: After completing this exercise, you will have created:

A SQL Server Profiler template.

An Analysis Services Profiler template.

Question: Would you have added any additional monitoring tools to the approach laid out
in this module?

Question: Which graphical tool would you use to identify locking and blocking that
currently exists on the server, with a view to terminating a process that is causing the
blocking?

Module Review and Takeaways


A wide array of tools is available within SQL Server to help you and the BI operations team to log and
monitor the activity that is occurring on the SQL Server. It is important that you adopt the tools that will
provide you with the best information possible to resolve any issues. You should also be careful to select
only the information that can assist with troubleshooting. Having too much information will cause an
overload and slow down the process of identifying the cause of an issue and, ultimately, resolving it. The
need for logging and monitoring should now be clear, and you should be able to select the tools to
perform your role.

Module 6
Troubleshooting BI Solutions
Contents:
Module Overview
Lesson 1: Troubleshooting Failed BI Solutions
Lesson 2: Troubleshooting the Data Warehouse
Lesson 3: Troubleshooting SQL Server Analysis Services
Lesson 4: Troubleshooting SQL Server Reporting Services
Lab: Troubleshooting BI Solutions
Module Review and Takeaways

Module Overview
The task of trying to troubleshoot failed BI solutions can be complex. It requires an understanding of the
environments in which the BI solution is hosted, and an understanding of the workloads that take place
during the life cycle of the solution. Troubleshooting can be made easier if the BI operations team has
established defined standards for different tiers of servers for the configuration, security, and deployment
of the solution. Standards create a baseline environment for the servers and the solution so that the BI
operations team have a clear understanding of the environment that they are troubleshooting.
With this in place, when an issue is reported to the operations team, they can adopt a structured
troubleshooting approach that means they can resolve the issue, and understand the root cause—this
leads to a long-term fix. As these issues are occurring within live environments, it is prudent to follow a
process that is in line with operational procedures, so that you set the expectations for resolving an issue.
This will typically involve applying a fix that follows either standard operating procedures or emergency
operating procedures.

Objectives
At the end of this module, you will know the correct approach for troubleshooting:

 Failed BI solutions

 Data Warehouse

 Analysis Services

 Reporting Services

Lesson 1
Troubleshooting Failed BI Solutions
Many aspects of a BI solution can fail—including a data warehouse load not completing, the Analysis
Services processing taking too long, or reports showing data that is out of date. Even if the loading of a BI
solution succeeds, the BI operations team may get service desk requests from users with problems,
including the inability to access an individual report, or reports taking too long to generate. A structured
troubleshooting approach means an operations team can:
 Understand the symptoms of the problem.

 Determine the root cause of a problem.

 Understand the impact of a fix.

 Apply the fix for the problem.

Lesson Objectives
After completing this lesson, you will understand:

 The troubleshooting approach.

 The importance of root cause analysis.

 The importance of impact analysis.

 Common troubleshooting scenarios.

 How to troubleshoot the operating system.

Troubleshooting Approach
A good troubleshooting approach should have in
place a broad logging and monitoring method
that will record information about the general
health of Windows®, SQL Server® and its related
components. Making use of the tools that provide
this information, such as Windows event logs, and
SQL Server error logs, will be the starting point for
identifying the areas that are having problems. In
certain circumstances, the answer to the problem
may lie in these log files—such as a hard disk
going offline, or a data warehouse database
running out of disk space. Otherwise, more
investigation may be required that will involve using other tools, and talking to the user who first raised
the issue.

The area of SQL Server that is experiencing problems will determine the type of tool that you use to
perform more targeted investigations. For many SQL Server components, you can use Windows Reliability
and Performance Monitor, using counters specific to the SQL Server component to perform further
analysis. SQL Server Profiler could be used to capture events that relate to the database engine or Analysis
Services. Custom logging in Integration Services may help in providing custom information about SQL
Server Integration Services (SSIS) data loads. Using Transact-SQL to query the report execution logs may
provide answers to slow running reports in SQL Server Reporting Services (SSRS). The aim of this targeted
analysis is to find evidence as to the root cause of a reported issue, and then discuss with the BI
operations team a proposed fix and the impact of applying that fix to an environment.

The Importance of Root Cause Analysis


Root cause analysis uses processes and tools to
identify the ultimate cause of a fault or problem
within a system. In many operational
environments, it is typical for a support team to
focus on the resolution of a visible problem,
without fully dealing with the original underlying
cause. Whilst a fix to the visible problem provides
a short-term solution, it does not guarantee that
the problem will not recur—and maybe have a
different visible problem. As such, the visible
problem may merely be a symptom of a more
serious underlying root cause issue. To avoid
reoccurrence, it is important for the BI operations team to spend time checking whether a reported issue
is actually a root cause issue, or merely a symptom of a wider problem.

This type of checking can be difficult in fast paced operational environments, where it is important to fix
an issue quickly. In such circumstances, it might be more pragmatic to apply a short-term fix to a
problem, and then perform a retrospective root cause analysis, with a view to applying a long-term fix.
The findings and supporting evidence should be documented, and associated against the service desk
ticket that raised the issue. This means the team can decide whether the issue could be problematic on
other servers within the operational environment—the fix could then be applied as a standard to all the
servers.
Alternatively, the root cause analysis might conclude that a proposed fix may not provide a long-term
solution, and that the scenario will resurface when a particular set of conditions occur on a server. In this
situation, it might be pragmatic to define an operating procedure for dealing with the reoccurrence. This
will provide the team with a signed-off agreement on how to deal with the issue in the future, and reduce
the time it takes to respond to the issue. Should the server require being placed offline when applying a
fix, then an emergency operating procedure would be defined; otherwise, a standard operating procedure
would be defined.

The Importance of Impact Analysis


When discussing a proposed fix with the BI
operations team, it is important to perform an
impact analysis. This means that the team can set
expectations for the decision makers, such as the
data director, about the impact of making the
change to the business, and when the fix should
be performed.

The type of questions that should be considered include:

 Which SQL Server components are affected by the fix?

 Will any dependency SQL Server or application components be affected?

 Will any Tier 1 category servers be affected?

 When can the fix be performed?

 Does the fix require a server to be offline, and does this have a financial impact on the business?

If possible, the BI operations team should try to reproduce the issue in a nonproduction environment, and
perform a dry run through of applying the fix. This will give valuable information as to the outcome of
applying the fix, and further inform the impact analysis. In some organizations, a dedicated support
environment—that mirrors the production environment—may be made available for this purpose.

Common Troubleshooting Scenarios


Whilst you cannot cover all the possible scenarios
that would require troubleshooting, there are
common situations that will occur with many BI
solutions. Many of the reported issues from users
will surround the lack of access to a report, slow
report generation, or the complete unavailability
of reports. In addition, there are background
automated processes that will encounter issues
that also require troubleshooting, specifically in
the following areas:
 Data warehouse. The common issues that are
found in many data warehousing solutions
include slow data loads and package failures. Investigating failed packages will typically involve SQL
Server Agent history and the SSIS Catalog through standard or custom logging. Troubleshooting slow
data loads will involve using Transact-SQL queries against dynamic management views (DMVs), and
using SQL Server Profiler against the database engine.
 Analysis Services. Processing and querying performance are typically the main areas for
troubleshooting with Analysis Services. In this scenario, it is typical to use SQL Server Profiler and
Performance Monitor counters to investigate issues. You can open the msmdsrv.ini log file to confirm
the configuration of an Analysis Services instance should the instance not be accessible from SQL
Server Management Studio. It is important that you retrieve information about the setup of an
instance before collecting performance information.
 Reporting Services. The common issues that are associated with Reporting Services involve poor
report performance, limited report functionality, and subscriptions not being delivered. The SSRS
report execution logs can help to retrieve information regarding the performance of a report. You
can use the Reporting Services Web Portal interface to help with the resolution of subscription issues.

Troubleshooting the Operating System


Before troubleshooting any of the BI components,
you should always check for errors with the
operating system and hardware—to eliminate the
underlying platform as a root cause of an issue. It
is common for a support team to overlook this
aspect of troubleshooting, but a quick check of
key metrics in this area may save a lot of time
when trying to resolve an issue. The checks that
you should perform when troubleshooting
include:

 Note the observable performance of the operating system when you log in. Logging
on to the server will provide the first indication of an issue. If the login process and the time it takes
to load the desktop is too long, there could be an issue with the hardware configuration or the way it
operates.

 Check the task manager for the running processes and the current performance. Task manager
can provide a quick snapshot of the current performance and the processes that are running on the
server. You can view the processes tab to quickly assess the memory, CPU, disk, and network
consumption of a given process. You can use this to provide the first clue as to the main area of focus.

 Confirm the hardware configuration by running MSInfo32 or viewing System Information. This
can be a particularly important check for virtual servers, because it confirms that the hardware
specifications are performing as expected. There may be a situation where a virtual server has its
configuration changed unknowingly. For example, where the memory previously allocated to the
virtual server has been reduced, this would affect performance.
 View the system log in event viewer. The system log in event viewer provides useful information
about the operating system and the hardware. You should filter the system log to display errors and
warnings for the time period since the error occurred.
These cursory checks should not take more than five minutes to perform, but the information provided
will help you to quickly identify where you should focus your troubleshooting efforts, and eliminate the
operating system or hardware as the cause of the problem.
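When the SQL Server instance itself is still accessible, a quick Transact-SQL check can complement these operating system checks. As a minimal sketch, the following query returns the CPU and memory that SQL Server can see, which you can compare against the expected hardware specification:

Checking the hardware that SQL Server can see

SELECT
    cpu_count
  , hyperthread_ratio
  , physical_memory_kb / 1024 AS physical_memory_mb
  , committed_target_kb / 1024 AS target_memory_mb
  , sqlserver_start_time
FROM sys.dm_os_sys_info;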

Lesson 2
Troubleshooting the Data Warehouse
The data warehouse provides the foundations for the BI solution. To ensure it operates successfully, it
depends on a number of SQL Server components. The main components are the database engine (to
store the data), Integration Services (to move and transform the data), and the SQL Server Agent (to
initiate the data warehouse execution loads). As a result, troubleshooting will typically start in these areas.
If you adopt a structured approach to troubleshooting, you can narrow down the extent of your
investigations.

Lesson Objectives
At the end of this lesson, you will be able to troubleshoot:

 SQL Server Agent jobs.

 SSIS packages.
 The data warehouse.

Troubleshooting SQL Server Agent Jobs


The SQL Server Agent is typically used to initiate a
scheduled execution of one or more SSIS packages
to load and transform data into a warehouse. If an
issue is reported regarding the failure of SSIS
package execution, you should start your
investigation with the SQL Server Agent.
Check that the SQL Server Agent is started and
check error logs
Before checking individual jobs for errors, you
should perform a visual check that the SQL Server
Agent is running in SQL Server Configuration
Manager or SQL Server Management Studio. If the
service has stopped unexpectedly, you should check the SQL Server Agent error logs for evidence as to
why this has happened.
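As an alternative to a visual check, the following query is a simple way to confirm the service status from Transact-SQL (the DMV is available on SQL Server 2008 R2 SP1 and later):

Checking the SQL Server Agent service status

SELECT
    servicename
  , status_desc
  , startup_type_desc
  , last_startup_time
FROM sys.dm_server_services;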

Set up notifications for critical data warehouse jobs

SQL Server Agent jobs can be configured to trigger a notification when an agent job fails, succeeds, or
completes. You should configure this for key jobs that execute a data warehouse load. Notifications are
sent to an operator using either an email message or a pager. It is typical to send an email message to an
operator to notify them of a failure, so that a team member can be informed immediately.
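As a minimal sketch, the following Transact-SQL configures a failure notification for the EIM_Demo BI Load job used in this course. It assumes that Database Mail is already configured; the operator name and email address are examples only:

Configuring a failure notification for a data warehouse load job

-- Create an operator that represents the BI operations team (example address).
EXEC msdb.dbo.sp_add_operator
    @name = N'BI Operations',
    @email_address = N'bi-ops@adventure-works.com';

-- Notify the operator by email when the job fails (notify_level_email 2 = on failure).
EXEC msdb.dbo.sp_update_job
    @job_name = N'EIM_Demo BI Load',
    @notify_level_email = 2,
    @notify_email_operator_name = N'BI Operations';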

Check the SQL Server Agent history of a failed SSIS job

A view of the Jobs node in the Object Explorer Details window will show you which jobs have failed, as
denoted by a red cross on the SQL Server Agent job. You can view the details of the history in the log file
viewer information—this includes the date, message, log type, and log source of the job. The amount of
history that is held is determined by how the properties of SQL Server Agent are set. In this area, you can
define the file size of the log viewer, and specify a time period when history will be removed from the log.
The information provided by the history will indicate which package and task failed. For custom logging
activities, it is important to set the logging level within the SQL Server Agent job to ensure that the
logging occurs.
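You can also retrieve the same history with Transact-SQL. The following query, shown here as a sketch, lists the most recent failed job steps recorded in the msdb database:

Listing failed SQL Server Agent job steps

SELECT TOP (50)
    j.name AS job_name
  , h.step_id
  , h.step_name
  , h.run_date
  , h.run_time
  , h.message
FROM msdb.dbo.sysjobhistory AS h
INNER JOIN msdb.dbo.sysjobs AS j
    ON h.job_id = j.job_id
WHERE h.run_status = 0  -- 0 = Failed
ORDER BY h.instance_id DESC;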

Troubleshooting SSIS Packages


Whilst the SQL Server Agent can give information
about which package failed and during which task,
you might want more detail to further understand
if a package failure was a one-off event, or a
symptom of an underlying problem. You will need
to use other tools to provide this and the
supporting information. You should consider
making the following checks after the operating
system checks have been performed:

 Check the application log in event viewer for any errors. The application log may
contain errors that are specific to the failing of
SSIS packages. You should correlate this with any other system and application logs that may indicate
why a package failed. For example, the system log may report network connection errors, while the
SSIS package reports an inability to connect to a network share.

 Look at the default views that are available within the SSISDB. When an SSIS Catalog is deployed
to SQL Server, and SSIS packages are deployed to the SSIS Catalog, a record of the package execution
is held inside the SSISDB database. Many catalog views can be used to retrieve this execution
information.

The following code can be used to retrieve the error messages that are recorded by an SSIS package
execution, by querying the operation_messages and operations views in the SSIS Catalog:

Querying error messages in the SSISDB


SELECT
OP.[object_name]
, OM.[message_time]
, OM.[message]
FROM catalog.operation_messages AS OM INNER JOIN catalog.operations AS OP
ON OP.operation_id = OM.operation_id
WHERE OM.message_type = 120

You can create your own custom Transact-SQL statements that return the information that you require
when troubleshooting failed packages:
 If configured, look through the SSIS custom logs. You can use SSIS to select the level of logging
that is used when a package is executed. This can include the default logging levels, or any custom
log that has been created. You can use the reports that are generated to receive targeted information
regarding any errors that will occur with the packages that are executed.
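For packages deployed to the SSIS Catalog, the warnings and errors captured by the selected logging level can also be queried directly. The following query is a sketch that returns the most recent error and warning events (the events are only present if the execution used a logging level that records them):

Querying error and warning events in the SSIS Catalog

SELECT TOP (50)
    EM.operation_id
  , EM.message_time
  , EM.package_name
  , EM.event_name
  , EM.message
FROM SSISDB.catalog.event_messages AS EM
WHERE EM.event_name IN (N'OnError', N'OnWarning')
ORDER BY EM.message_time DESC;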

 If using, review the SSIS report packs. If you have downloaded the SSIS report pack from Microsoft
CodePlex, a dashboard is created in Reporting Services that will provide information about executed
packages. You can use this dashboard to identify which packages have failed.

Troubleshooting the Data Warehouse


As the data warehouse stores the data that is
consumed by other BI applications, it is not
uncommon for investigations to take place in this
area, even if the issues relate to other components,
such as Analysis Services or Reporting Services.
Slow cube processing or report retrieval may have
its root cause in the underlying data warehouse.
As a result, the following checks should be
performed against the data warehouse:

 Pay more attention to the operating system checks in the context of SQL Server.
Use SQL Server Configuration Manager or SQL
Server Management Studio to confirm that the instance of SQL Server on which the data warehouse is
stored has had its memory and CPU configured, in balance with the rest of the system.

You can also use the following SQL script to confirm the memory setting on a SQL Server instance:

Reporting SQL Server instance memory configuration


SELECT
name,
value,
value_in_use
FROM sys.configurations
WHERE name like '%server memory%'

 Check for fragmentation levels of the data warehouse. High fragmentation levels found in a
database are an indication that there is a lack of maintenance. Over time, if maintenance is not
performed on a database, query processing will take more time and other operations will slow down.
You can use the following query to provide information on the level of fragmentation of tables and
indexes within a database:

Identifying percentage fragmentation of data in a SQL Server database


SELECT
index_id
,index_type_desc
,avg_fragmentation_in_percent
,avg_page_space_used_in_percent
,page_count
FROM sys.dm_db_index_physical_stats
(DB_ID(N'EIM_Demo'), NULL, NULL, NULL , 'SAMPLED')
ORDER BY avg_fragmentation_in_percent DESC

 Identify any long running queries. You can use SQL Server Profiler to capture the queries that have
been running on the server. You can then analyze the results to identify any long running queries that
have been occurring during the execution of Profiler. As an alternative, and if the server has been
running for a lengthy period of time, you can also use DMVs to provide an average execution time for
queries.

An example may include executing a script against the sys.dm_exec_query_stats and sys.dm_exec_sql_text
DMVs to identify the top 10 long running queries.

Finding the top 10 longest running queries


SELECT TOP 10
creation_time
, last_execution_time
, execution_count
, total_elapsed_time / execution_count avg_elapsed_time
, SUBSTRING(st.text, (qs.statement_start_offset/2) + 1
, ((CASE qs.statement_end_offset
WHEN -1 THEN DATALENGTH(st.text)
ELSE qs.statement_end_offset END
- qs.statement_start_offset)/2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) st
ORDER BY total_elapsed_time / execution_count DESC;

Lesson 3
Troubleshooting SQL Server Analysis Services
From an operational perspective, SQL Server Analysis Services can have issues in one of three main areas.
The cube processing may be slow; the query performance of the cube may be slow; or users may have
issues when trying to access the cube. Before performing in-depth investigations of the Analysis Services,
it is important to perform the operating system and data warehouse troubleshooting steps, because
Analysis Services has a dependency on these areas.
Once you are satisfied that the dependency technologies are not affecting Analysis Services, you can then
use a number of tools to provide information to help you resolve the common operational issues.

Lesson Objectives
After completing this lesson, you will be able to troubleshoot:

 Cube processing.
 Cube query performance.

 Cube access.

Troubleshooting Cube Processing


When processing a cube, the designed
aggregations are calculated and loaded into the
cube structure, along with the data. You process a
cube by querying the dimension tables to
populate the levels with members from the actual
data, and reading the fact table to calculate
aggregations. During the processing, this involves
using Transact-SQL queries to retrieve the data.
Should the cube processing take a long time, you
can perform the following checks to resolve the
situation:

 Identify any processing queries that would benefit from indexes. You can use SQL Server Profiler or Transact-SQL queries to identify if there are
any long running queries that are being generated during the cube processing. This will help to
identify the queries and the tables that are being accessed.

In addition, you can run the following query to return the suggested missing indexes to create in a
database:

Missing index query


SELECT
CAST(SERVERPROPERTY('ServerName') AS [nvarchar](255)) AS [SQLServer]
, db.[database_id] AS [DatabaseID]
, db.[name] AS [DatabaseName]
, id.[object_id] AS [ObjectID]
, id.[statement] AS [FullyQualifiedObjectName]
, mgs.[user_seeks] * mgs.[avg_total_user_cost] * (mgs.[avg_user_impact] * 0.01) AS
[IndexAdvantage]
, 'CREATE INDEX [Missing_IXNC_' + OBJECT_NAME(id.[object_id], db.[database_id]) + '_'
+ REPLACE(REPLACE(REPLACE(ISNULL(id.[equality_columns], ''), ', ', '_'), '[', ''), ']',
'')
+ CASE
WHEN id.[equality_columns] IS NOT NULL
AND id.[inequality_columns] IS NOT NULL
THEN '_'
ELSE ''
END + REPLACE(REPLACE(REPLACE(ISNULL(id.[inequality_columns], ''), ', ', '_'),
'[', ''), ']', '') + '_'
+ LEFT(CAST(NEWID() AS [nvarchar](64)), 5) + ']' + ' ON ' + id.[statement]
+ ' (' + ISNULL(id.[equality_columns], '')
+ CASE
WHEN id.[equality_columns] IS NOT NULL
AND id.[inequality_columns] IS NOT NULL
THEN ','
ELSE ''
END + ISNULL(id.[inequality_columns], '') + ')' + ISNULL(' INCLUDE (' +
id.[included_columns] + ')', '') AS [ProposedIndex],
CAST(CURRENT_TIMESTAMP AS [smalldatetime]) AS [CollectionDate]
FROM [sys].[dm_db_missing_index_group_stats] mgs
INNER JOIN [sys].[dm_db_missing_index_groups] mig WITH (NOLOCK)
ON mgs.[group_handle] = mig.[index_group_handle]
INNER JOIN [sys].[dm_db_missing_index_details] id WITH (NOLOCK)
ON mig.[index_handle] = id.[index_handle]
INNER JOIN [sys].[databases] db WITH (NOLOCK)
ON db.[database_id] = id.[database_id]
WHERE db.[name] = N'EIM_Demo'
ORDER BY [IndexAdvantage] DESC
OPTION (RECOMPILE);

It is important to note that the previously-mentioned query may make multiple index suggestions for the
same table. Additionally, it is important that multiple indexes are not created on the same table to the
detriment of performance, and that the indexes implemented are tested.
 Cube processing time-out. There may be occasions where the processing of the cube fails due to a
query time-out and the following error is returned: OLE DB error: OLE DB or ODBC error:
Operation canceled; HY008. This occurs as the result of a time-out expiry related to the
SQL_QUERY_TIMEOUT setting, meaning the command time-out or query time-out threshold was
reached, and the running query was cancelled. In this scenario, you can modify the server’s advanced
properties—specifically, the ExternalCommandTimeout property—and increase the value to provide
more time for the processing queries to execute.

Troubleshooting Cube Query Performance


Analysis Services provides tools that are specifically
targeted to address the poor query performance
of multidimensional cubes. By using query
logging, and combining this with the usage-based
optimization wizard, you implement targeted
aggregations that might improve the query
performance.

Use query logging to record the queries against a cube

The query logging feature that is available within SQL Server Analysis Services is extremely useful for
recording the queries that are issued against the
server. Query logging is configured in the Analysis Services instance properties.

 QueryLog\QueryLogSampling. This specifies the query log sampling rate. The default value for this
property is 10, meaning that one out of every 10 server queries is logged.
 QueryLog\QueryLogConnectionString. This specifies the connection to the query log database.

 QueryLog\QueryLogTableName. This specifies the name of the query log table. The default value
for this property is OlapQueryLog.

 QueryLog\CreateQueryLogTable. This is a Boolean property that specifies whether to create the
query log table. The default value for this property is false, which indicates that the server will not
automatically create the log table and will not log query events.

You can use the query log to identify the queries that are causing issues with the Analysis Services cubes.
The query log’s benefits are fully realized when used in conjunction with the Usage-Based Optimization
Wizard that is covered in the next module.
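After query logging has been enabled, you can inspect the log table directly. As a sketch, and assuming the default table name of OlapQueryLog in the database named by the QueryLogConnectionString property, the following query returns the longest running logged queries:

Reviewing the Analysis Services query log

SELECT TOP (20)
    MSOLAP_Database
  , MSOLAP_ObjectPath
  , MSOLAP_User
  , StartTime
  , Duration
FROM dbo.OlapQueryLog
ORDER BY Duration DESC;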

Troubleshooting Cube Access


In this area, you might start to see service desk
tickets from users because they will be accessing
the data from the cubes, either directly through
Microsoft Excel®, or through Reporting Services. It
is important to check whether the reported issue is
being experienced only by the user raising the
issue, or if it is more widespread and affecting a
group of users. To check for access to a cube or
tabular model, you can perform the following
checks:

Ensure basic network connectivity checks are made

It is prudent to first check if the connectivity problem is caused by general network connection issues. You
should ask the user if they can access other applications or network resources first. In addition, you can
run standard network connectivity checks by using tools such as PING or Traceroute. If connectivity is
confirmed by using these tools, then application network connectivity checks can be performed—for
example, this can include connecting to Analysis Services through Excel or SQL Server Management
Studio. If connectivity is still not occurring, you should check if the instance of Analysis Services is a named
instance. You should then check that the SQL Server browser service is running in SQL Server
Configuration Manager, and then ensure that the correct port number is being used to connect to the
service.
To confirm which port number that a named instance of Analysis Services is running under, you should
perform the following steps:

Confirming Analysis Services port number

1. Open Task Manager and get the Process ID (PID) for msmdsrv.exe.
2. Open the command prompt and type netstat -abo >> c:\output.txt.
3. Look for the PID in the output file and the corresponding TCP IP:Port information for the same PID.
4. To confirm whether you have the correct port number, open SQL Server Management Studio and
connect to Analysis Services using the IP Address:Port Number (for example, 192.168.1.1:5585).

If the default port 2383 is not used, you can update the port information in the instance properties in
Analysis Services, and then connect from the client machine, using the port to confirm if connectivity is
resolved.
Check user access and permissions

In this scenario, a user is prompted to authenticate to an Analysis Server but, when providing the
credentials, is unable to connect. The troubleshooting approach to this issue depends on whether the user
is accessing the cube directly, through an application such as Excel, or if they are accessing the service
through another application, such as Reporting Services.
By using a direct connection to Analysis Services, you can check the roles node to see if the user exists
within a role that has been granted access to the data model. If the management of the security is based
on Active Directory® groups, you might need to liaise with the team that manages the groups to confirm
the user’s membership; however, you should first confirm that the group is allowed access to the data
model. Connection to the data model through another application involves more investigation. In this
scenario, a user may connect to a Reporting Server that then connects to the Analysis Server to retrieve
the data—the user is authenticating against the Reporting Server, and then Reporting Services should be
retrieving the data from a data model on behalf of the user. This process is referred to as double hop
authentication. In this circumstance, you would have to liaise with the network team that has probably set
up the double hop authentication using delegation and impersonation in Active Directory, and the setspn
command.

Delegation is the process of giving an Active Directory account permissions to perform a task. An example
is the ability to impersonate another user account. Impersonation is the process of one account
impersonating the credential of another—for impersonation to work, delegation of this permission must
be done first. You can use the setspn command-line tool to manually register an application within Active
Directory, so that it appears as an object that can be managed. With this in mind, the Reporting Services
application can be registered within Active Directory as an object using setspn. You can then set up
delegation for the application in Active Directory.

In this scenario, user access may be revoked as the result of an incorrect setting in either the setspn
command line tool, or the incorrect setting of delegation in Active Directory. After you have confirmed
that the user can access the data model directly through a tool such as Excel, you will need to work with
the Active Directory team.

Lesson 4
Troubleshooting SQL Server Reporting Services
Reporting Services is typically the most visible application that is used by users in a BI solution. The service
desk tickets that are submitted will be varied—they can range from functionality issues, such as report
parameters not working on a report as expected, to reports not rendering in a timely manner. As a result,
the BI operations team should become very familiar with this technology.
Reporting Services may provide symptoms to many of the underlying issues that have already been
discussed in this module. For example, a lack of access to an Analysis Services data model may initially be
described as a Reporting Services issue. The job of the BI operations team is to look beyond the symptoms
to find the root cause.

Lesson Objectives
After completing this lesson, you will be able to troubleshoot:
 Reporting problems.

 Subscription issues.

 Report access.

Troubleshooting Report Problems


Whilst many support requests will involve
subscriptions failing, or a failure to gain access to
the Report Server, other service desk tickets will
fall into the category of poor Report Server
performance or usability issues.

Report performance
Users may complain about the amount of time it
takes to pull a report when browsing on a Report
Server. The BI operations team can use the
Reporting Services execution log covered in
Module 7 to determine if the issue is related to
poor performance in retrieving the data from a
data source, or whether the time is mainly spent processing the data on the Report Server—or in the
rendering of the report.
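As a sketch, the following query against the ExecutionLog3 view in the report server catalog database (the database name may differ if a nondefault name was chosen during configuration) breaks the total report execution time into its data retrieval, processing, and rendering components, reported in milliseconds:

Breaking down report execution time

USE ReportServer;
GO
SELECT TOP (20)
    ItemPath
  , UserName
  , TimeStart
  , TimeDataRetrieval
  , TimeProcessing
  , TimeRendering
  , Status
FROM dbo.ExecutionLog3
ORDER BY TimeDataRetrieval + TimeProcessing + TimeRendering DESC;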

Should the issue relate to the time it takes to retrieve the data from the data source, SQL Server Profiler or
DMVs can be used to establish the performance of the query in relation to the additional workload that
will be occurring against the data source.

With the processing of data on the Report Server, Windows Reliability and Performance Monitor contains
a counter named Memory Pressure State in the Report Server Service object—you can use this to
determine if the Report Server is suffering from memory pressure. You can then include additional
counters to see the state of the memory in relation to the operating system and other BI components.

Rendering performance issues might require you to make an adjustment to a report so that it renders in a
more efficient manner. The BI operations team should also keep on top of understanding whether a
service pack or cumulative update may help to resolve a rendering issue—not only with SQL Server, but
also with the technology that is rendering the report. For example, if there is an issue with rendering an
HTML report to Internet Explorer®, you will also want to consider applying updates to the rendering
technology.
Usability issues

Usability issues can be varied, but common usability issues relate to the following areas of Reporting
Services:
Report parameters usage with snapshots
When a report is configured to use a snapshot to store data, there is an impact on the report parameters
that are linked to query parameters. In this instance, the report parameter in question will be greyed out
and deemed unusable. This is because the query parameter only returns the information that is defined
against the report parameter when a snapshot is created. It takes an image of the data at the time of
creation, and this is then used when the report is accessed. To resolve this issue, you should either not use
a snapshot, or reconfigure the report parameter to use a filter instead.

Linked reports

A simple error where the user states that a report is not behaving as expected. A linked report is a report
that points to a base report but has its own properties configured, such as parameter defaults. The user
will sometimes run a linked report without realizing it and, because its properties differ from those of the
base report, they will receive different data from what was expected. In this situation, you should politely
advise the user that they are accessing the wrong report.

Missing historical reports

Making use of the snapshot feature in Reporting Services also provides the opportunity to save historical
copies of reports. You may receive service desk tickets saying that historical reports are missing. On the
assumption that the historical reports have not been deleted, you should check the retention period that
is configured for the report. This can be done in the properties of a report in the History page—you
should also check the default system-wide retention setting in Site Settings in the web portal.

Troubleshooting Subscription Issues


Reporting Services users can set up push delivery
of their reports through email messages or file
share. This can be managed with standard
subscriptions by the user themselves or by an
administrator using data-driven subscriptions.
Users may receive errors in the processing of
subscriptions—these can be confirmed by the BI
operations team for remediation.

Use the ReportServerService.log file to determine the standard subscription status

The ReportServerService_<date>.log file is located in the Program Files\Microsoft SQL
Server\MSRS14.MSSQLSERVER\Reporting Services\LogFiles folder, and a date is stamped against each
log file that is created. Within this file, you can look for the following error messages to determine the
subscription status:

Subscription status                          Description

Failure sending mail                         The Report Server could not connect to the email server.
                                             Check the email settings in Reporting Services Configuration
                                             Manager.

Failure connecting to a destination folder   The Report Server could not find the location defined in the
                                             subscription. Check that the folder exists.

The file a could not be written to file b    File a could not update or overwrite file b. Check the
                                             subscription settings to allow new files to overwrite old
                                             files, and then check the permissions of the folder.

Failure writing file a                       File a could not be written to the folder. Check the
                                             permissions of the folder.
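The last recorded status of each subscription can also be read directly from the report server catalog database. Querying these tables is a read-only diagnostic technique rather than a supported interface, so treat the following as a sketch:

Reviewing the last status of report subscriptions

USE ReportServer;
GO
SELECT
    c.Path
  , c.Name
  , s.Description
  , s.LastStatus
  , s.LastRunTime
FROM dbo.Subscriptions AS s
INNER JOIN dbo.Catalog AS c
    ON s.Report_OID = c.ItemID
ORDER BY s.LastRunTime DESC;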

Troubleshooting Report Access


The troubleshooting approach used for accessing
Analysis Services should also be used with
Reporting Services. If the user receives the error
message “HTTP Status 401 Unauthorized Error”,
you should first check that the user has access to
Reporting Services, and then make the basic
network connectivity checks. Finally, double hop
authentication should be checked, especially if
Reporting Services is installed as a shared service
in a SharePoint farm.
In addition, some of the errors returned are
specific to Reporting Services. The
reportserverservice log file can be used to review the following errors:
 Secure Socket Layer (SSL) errors. If a user connects to the Reporting Server and receives the
following error message: “The underlying connection was closed: Could not establish trust
relationship for the SSL/TLS secure channel,” the troubleshooting focus should be on the secure
connectivity to a Report Server. The simple solution would be to ask the user to connect to the Report
Server using HTTPS instead of the standard HTTP protocol, and then confirm if access to the Report
Server is established.

 HTTP 400 bad request. If the error is described as: "The webpage cannot be found" or “HTTP 400
error”, the Report Server database might not be available. You can use the Reporting Services
configuration tool to verify that the database is configured. You can use the Services console
application in Administrator Tools to verify that the SQL Server Database Engine instance has started.

 HTTP 503 errors. These errors can occur either during report processing or when you first access a
Report Server. This indicates that the Report Server is suffering from high memory pressure and is
refusing to accept new connections. You should review the Reporting Services configuration in the
context of the other workloads on this system to ensure there is enough memory for the Report
Server.

Lab: Troubleshooting BI Solutions


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

Adventure Works employees are increasingly frustrated by the time it takes for the business reports to
become available on a daily basis. The existing managed BI infrastructure—including data warehouses
and enterprise data models—are valued sources of decision-making information. However, users are
increasingly finding it takes too long for the data to be processed in the overnight load, resulting in
reports not arriving to business users until the early afternoon.

You are supporting the BI operations team in dealing with service desk tickets that the team thinks will
handle the root cause of the issue that Adventure Works is experiencing.

Objectives
At the end of this lab, you will be able to troubleshoot:

 The data warehouse

 SQL Server Analysis Services

Estimated Time: 60 minutes

Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Troubleshooting Data Warehouse Loads


Scenario
The BI operations team has received a service desk ticket from the data director that is requesting an
investigation into an unresponsive BI server. This is not the first time that this situation has occurred, and
there is now involvement from the CIO of Adventure Works, who wants this recurring problem to be
resolved and closed. You need to use the appropriate tools that will help identify the root cause of the
issue and resolve it.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Using the Appropriate Logging and Monitoring Tools


3. Collecting the Evidence

4. Applying the Fix

 Task 1: Prepare the Lab Environment


1. Read the lab and exercise scenarios.

2. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

3. Run Setup.cmd in the D:\Labfiles\Lab06\Starter folder as Administrator.



4. Using SQL Server Management Studio, manually restore the AW_SSAS Analysis Services database
from the D:\Setupfiles\AW_SSAS.abf file.

 Task 2: Using the Appropriate Logging and Monitoring Tools


 Read the service desk ticket found in the MIA-SQL_Unresponsive.jpg file in the
D:\Labfiles\Lab06\Starter folder.

 Task 3: Collecting the Evidence


 Execute the monitoring tools identified in the previous exercise. This should take at least five minutes.

 Task 4: Applying the Fix


1. Apply a fix to the root cause issue, based on the findings and conclusions.

2. To test your fix, run the SQL contained in D:\Labfiles\Lab06\Starter\BI_LoadReset.sql.

3. Rerun the EIM_Demo BI Load job, but note that the job now completes with an error, which you will
resolve in the next exercise.

Results: After completing this exercise, you will have:

Used the appropriate logging and monitoring tools to identify the issue.

Resolved the unresponsive nature of the BI solution with a permanent fix.

Exercise 2: Troubleshooting SQL Server Analysis Services


Scenario
After applying the appropriate fix to the previous exercise, the BI operations team reports that there are
still issues with the loading of the entire BI solution using the AW_BI solution that has been deployed to
production. The SSIS package is failing when trying to process the Analysis Services cube.

Using the logging and monitoring tools that you have at your disposal, you will identify the root cause of
the problem and apply a fix to ensure that the BI load completes successfully.
The main tasks for this exercise are as follows:

1. Using the Appropriate Logging and Monitoring Tools

2. Diagnose the Issue


3. Applying the Fix

 Task 1: Using the Appropriate Logging and Monitoring Tools


 Execute the monitoring tools identified in the previous exercise. This should take at least five minutes.

 Task 2: Diagnose the Issue


1. Start SQL Server Management Studio as administrator and explore the SSAS EIM Demo cube.
2. Process the dimensions, then try to process the cube. Note that the cube fails to process due to a data
issue in the Agents dimension.

 Task 3: Applying the Fix


1. Add data to the Agent dimension to enable the cube to be processed.

Note: After adding the data, do not forget to reprocess the dimension before processing
the cube.

2. After reprocessing the dimension process the EIM Demo cube. Note that another error occurs, this
time caused by a data issue in the Customers dimension.

3. Add a record to the Customers dimension to enable the cube to be processed.

4. Reprocess the Customers dimension, and then process the EIM Demo cube.

Results: After completing this exercise, you will have:

Used the appropriate logging and monitoring tools to identify the issue.
Resolved the cube processing failures so that the BI load completes successfully.

Question: Discuss with the group the approach that you used to identify the root cause issue
of the problem.

Question: On reflection, is there anything you would change about the approach or the
tools that were used to troubleshoot the BI solution?

Module Review and Takeaways


In this module, you have focused on troubleshooting a BI solution and learned how you can use SQL
Server native tools to help inform you about the errors that are occurring on the system. You also
explored the common metrics that are used by an operations team to help identify the cause of BI issues.
This was put into practice with a lab where you used the tools to identify and then resolve an issue. As a
result, you have learned how to:

 Troubleshoot failed BI solutions.


 Troubleshoot a data warehouse.

 Troubleshoot Analysis Services.

 Troubleshoot Reporting Services.



Module 7
Performance Tuning BI Queries
Contents:
Module Overview
Lesson 1: The Need for Performance Tuning
Lesson 2: BI Queries to Performance Tune
Lesson 3: Tools for Performance Tuning
Lesson 4: Remediating Performance Issues
Lab: Performance Tuning a BI Solution
Module Review and Takeaways

Module Overview
In this course, you have seen many of the operational activities that take place in an organization—they
will often lead to the provision of a long-term solution to an issue that has been occurring in a BI
environment. Sometimes, however, changes to resolve an issue that are made by the BI operations team,
such as optimizing the BI platform, may not have the desired results.

When the BI operations team are satisfied that they have exhausted all areas in attempting to resolve an
issue, they might need to work with the development team to look at tuning the query aspects of the BI
solution to improve performance. Many BI operations make extensive use of queries, and it might be
necessary to look at these queries in more depth to improve performance.

The BI operations team would also have to discuss taking advantage of BI component features to help in
performance. For example, a suggestion might be made that Reporting Services snapshots could be used
to help performance. However, the development team would need consulting to understand the impact
of using such functionality on the overall solution.

Objectives
After completing this module, you will be able to:

 Understand the need for performance tuning.

 Describe the BI queries required to carry out performance tuning.

 Use the tools for performance tuning.

 Remediate performance problems.



Lesson 1
The Need for Performance Tuning
Performance tuning is the process of making changes or improvements to a system to increase or
maintain performance. As the data loads in a BI system increase, there is typically a negative impact on
the BI system performance; therefore, modifications might need to be made to ensure performance
improvement or performance consistency. Many areas of the BI system can be modified to improve
performance but, occasionally, you will need to look into the code itself. You can seek help from the
developers, to see if changes can be made to the code, or if supporting objects can be created to meet
the performance tuning objective.

Before undertaking such an activity, it is important that the BI operations team remove any potential
obstacles to reviewing the code. A common piece of feedback that an operations team might receive is
that the performance issue has nothing to do with the code; instead, the platform on which the solution
resides is substandard. The BI operations team can remove this obstacle by optimizing the data platform
and providing the supporting evidence to the development team from the information collected in any
logging or monitoring activities—they can then confirm to the developers that the platform is optimized
in the best possible manner.
The key is to optimize all levels of the solution, and if optimization of the platform does not have the
desired results, then the BI code should be reviewed.

Lesson Objectives
After completing this lesson, you will be able to explain:

 The need for performance tuning.


 The best approach for performance tuning.

 How to adopt performance tuning activities in development activity.

Scenarios That Require Performance Tuning


BI solutions are constantly evolving. Changes to
functionality, changes in data volumes, or an
increase in the popularity of an aspect of the BI
solution, make the solution and the workloads
volatile. A solution that might have been optimized
at its delivery may quickly perform at a suboptimal
level when it is in a production environment. As a
result, performance tuning is necessary to make
changes to the solution, to ensure that it operates
within a performance level that is deemed
acceptable by the business. That sometimes
requires you to make changes to the queries that
are used in the solution.

Common scenarios that make use of queries include data warehouse loads, Analysis Services queries, and
Reporting Services activities—all can have an impact on the performance of a BI solution and should be
considered. In a single server scenario, there can be a focus on optimizing the service to take full
advantage of the hardware on which it is hosted. However, when services are shared on a single server,
there must be a balance of the services across the available hardware, in addition to the times when the
tasks will execute.

Performance tuning can make use of a range of tools, including Activity Monitor, Performance Monitor,
and data collectors. This module will explore using SQL Server Profiler, Execution Plans, and Database
Engine Tuning Advisor to deal specifically with providing information about improving the queries that a
BI solution might use.

SQL Server also includes the Query Store, a database-level feature that gives you an
insight into query plan choice and performance for Transact-SQL queries. Query Store automatically
captures a history of the execution plans and statistics. You can review the various plans that have been
used over time, and even force a plan to be used by SQL Server on subsequent executions of a query. This
gives the BI operations team even greater flexibility to optimize queries—it can be used in a range of BI
scenarios, such as loading a data warehouse or retrieving reports.
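As a minimal sketch, the following Transact-SQL enables the Query Store on the EIM_Demo sample database used in this course, lists the captured plans for the slowest queries, and shows how a known good plan could be forced (the query and plan identifiers are placeholders):

Enabling and querying the Query Store

ALTER DATABASE EIM_Demo SET QUERY_STORE = ON;
ALTER DATABASE EIM_Demo SET QUERY_STORE (OPERATION_MODE = READ_WRITE);
GO
USE EIM_Demo;
GO
-- Review the captured plans for the longest running queries.
SELECT TOP (10)
    q.query_id
  , t.query_sql_text
  , p.plan_id
  , rs.avg_duration
FROM sys.query_store_query AS q
INNER JOIN sys.query_store_query_text AS t
    ON q.query_text_id = t.query_text_id
INNER JOIN sys.query_store_plan AS p
    ON p.query_id = q.query_id
INNER JOIN sys.query_store_runtime_stats AS rs
    ON rs.plan_id = p.plan_id
ORDER BY rs.avg_duration DESC;

-- Force a plan that is known to perform well (placeholder identifiers).
-- EXEC sp_query_store_force_plan @query_id = 1, @plan_id = 1;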

SQL Server Profiler is an important tool for monitoring SQL Server Analysis Services (SSAS) query
performance. Operating in the same way as profiling for Transact-SQL, it gives you the opportunity to
identify any suboptimal queries. This is relevant for queries that are issued directly against the data model,
or for third-party applications that are querying the data model directly.

Performance Tuning Approach


When conducting the performance tuning of
queries, it is important to use a prescribed
methodology so that results are compared on an
equal basis. For this to occur, an environment that is
consistent with the production environment should
ideally be used, with the same settings, and a
workload that reflects the conditions that the
queries are working under. After the environment is
created, you should perform the following four
steps when performance tuning the queries:
 Establish a baseline for performance.
Information should be collected so that you
can understand the expected working patterns of the BI environment and the resources that allow it
to function. This is known as establishing a baseline. Baselining should account for different levels of
activity during a business cycle. This could be at day, month, or quarter level where there may be
differing BI activities. The baselining should also account for queries that run against a cold cache or a
warm cache, because performance will differ. Baselining will help the BI operations team make
decisions about activities that occur on the BI servers that fall below or above the expected baseline
level.

 Identify any bottlenecks to performance. As the BI solution grows, more demands are placed on
the solution and its resources. This can cause bottlenecks to performance through excessive use of
hardware, or an increase in the amount of locking and blocking, as the number of users increases. It is
important that you use the correct tool for identifying a performance bottleneck. For example, you
could use Task Manager, Performance Monitor or data collectors if the team suspects the issue
involves the hardware on the system. If queries are suspected to be a cause of locking and blocking, it
may be more appropriate to use SQL Server Profiler or Activity Monitor to identify long-running queries, and
then follow up with execution plans to establish the reasons for the bottleneck.

 Implement a change. When improving the performance of a query, it might be that indexes are
added to speed up the retrieval of the data without needing to change the underlying code.
Alternatively, the query code itself might be changed. The change should be implemented so that it
can be measured.
 Measure the performance. It is important to rerun the processes with your suggested fix in place,
with a view to repeating the same monitoring process that first identified the issue. This will confirm if
an improvement has taken place. If not, the exercise can be repeated, either by trying a different fix,
or where the original fix is accepted and placed as a change in the production environment.

Adopting Performance Tuning Activities in Development Activity


Many BI solutions will undergo testing in
nonproduction environments. This testing will
typically have a bias towards ensuring that
functional tests are performed. However, the
opportunity to also include performance testing
during this period is often overlooked.
By including performance testing in the
development life cycle, you can help to reduce the
risk of performance issues when the code is
deployed to the production environment. Many
development teams are under pressure to deliver
code within a tight timeframe, and this can
sometimes affect the cost of the testing process—where performance testing is inevitably sacrificed to
meet project timelines.

The BI operations team could become involved in the testing process, and so add value by helping the
development and testing team. The BI operations team can conduct performance tests on the code, and
make suggestions for improvements. This will produce the following benefits:

 It provides an independent test of the code.

 It helps the BI operations team understand expensive queries.

 It reduces the risk of suboptimal code going into production.

 It ensures performance testing is completed.

Whilst this activity may not be seen as an operational task, taking this proactive approach can benefit the
business, ensuring that the risks are managed, and that the need for supporting the solution is reduced.

Lesson 2
BI Queries to Performance Tune
Various activities that take place in a BI solution will make extensive use of queries—these can also occur
at different times during the BI operations window. Should there be an issue with performance, it is
important to collect the information that is required to help identify the issue, whilst minimizing the
impact of using the tools to collect the information. Therefore, you should be prudent with the tools that
are used.

Lesson Objectives
In this lesson, you will learn about the queries that are required to performance tune, with regards to:

 Transact-SQL queries.

 Analysis Services queries.

 SSIS data flows.


 Report generation.

Transact-SQL Queries
Various scenarios in a BI solution will use Transact-
SQL queries that you might need to optimize,
including:
 ETL data loads using the SSIS Execute SQL Task.

 Analysis Services processing.

 Querying MDS subscription views.


 Report generation.

ETL processes involve making use of Transact-SQL queries using the Execute SQL Task; Analysis Services
processing uses Transact-SQL statements to populate a cube with data; and Reporting Services will use
Transact-SQL to retrieve data to populate the reports.
reports.
It is important to identify the queries that are taking up the most time. This typically involves running a
dynamic management view (DMV) query that identifies the longest running queries. However,
information from other diagnostic tools might indicate a situation where this type of query alone is not
appropriate. For example, another diagnostic tool might indicate that there is high CPU usage. In this
case, a query that only identifies the longest running queries may not be appropriate; you may instead
wish to run DMV queries that examine query execution in the context of the CPU time that each query
consumes.

As a result, you can run the following query that shows the top 10 cached execution plans that use the
most cumulative CPU time. It also includes information about the amount of logical and physical reads
that have been used in executing a query.
Using a DMV to identify queries with the highest cumulative CPU time in milliseconds
--Top 10 CPU Cumulative waits
SELECT TOP 10
[qs].[creation_time]
, [qs].[execution_count]
, [qs].[total_worker_time] as [total_cpu_time]
, [qs].[max_worker_time] as [max_cpu_time]
, [qs].[total_elapsed_time]
, [qs].[max_elapsed_time]
, [qs].[total_logical_reads]
, [qs].[max_logical_reads]
, [qs].[total_physical_reads]
, [qs].[max_physical_reads]
, [st].[text]
, [qp].[query_plan]
, [st].[dbid]
, [st].[objectid]
, [st].[encrypted]
, [qs].[plan_handle]
, [qs].[plan_generation_num]
FROM
sys.dm_exec_query_stats qs
CROSS APPLY
sys.dm_exec_sql_text(plan_handle) AS st
CROSS APPLY
sys.dm_exec_query_plan(plan_handle) AS qp
ORDER BY qs.total_worker_time DESC

It is important that the information collected from the general logging is used to inform you of the
appropriate tools or queries to use to further analyze a specific problem in a particular context. This can
help you further troubleshoot the problems with a query, using Execution Plans, Query Store or the
Database Engine Tuning Advisor.

Analysis Services Queries


Analysis Services querying will use either Multidimensional Expressions (MDX) or Data Analysis
Expressions (DAX) to retrieve information from a data model. It is important to understand that Analysis
Services uses two components to process a query, regardless of whether you are using multidimensional
or tabular data models:

 The formula engine. This is used to perform the calculations that are required when a query is
issued. Analysis Services will return the data from the storage engine, apply the calculations, and then
return the result back to the client.

 The storage engine. This stores the data and presents it to the formula engine when a query is
requesting data. It also determines the level of data that should be returned to satisfy the query being
processed by the formula engine.
This appreciation of Analysis Services will help you to better understand the objects that you use in a tool
such as SQL Server Profiler. For example, the Query Subcube Verbose event in SQL Server Profiler will list
all the requests that are made from the formula engine to the storage engine. The Data from Aggregations
event will show whether the data retrieved from the storage engine and sent back to the formula engine
comes from aggregated data, rather than from leaf-level detailed data. The absence of rows for the Data
from Aggregations event may indicate that the aggregation design for the cubes needs to be revisited.

For tabular data models, you would want to include the VertiPaq SE Query End object, in conjunction with
the duration column, and then apply a filter where EventSubclass = "0 - VertiPaq Scan". If the duration for
these events is more than 50 percent of the total duration of a query, this indicates that the issue lies
within the storage engine.

As previously discussed, it is important to understand if the query is being executed from a cold cache or
a warm cache, as this can affect performance.

You can clear the SSAS cache to ensure there is a cold cache by running the following XMLA query in SQL
Server Management Studio connected to SSAS. The execution of a query will populate the cache, and
subsequent executions of the same query will work from a warm cache.

Clearing the SSAS cache using XMLA


<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
<Object>
<DatabaseID>Insurance</DatabaseID>
</Object>
</ClearCache>

When analyzing SSAS queries, it is best practice to clear the cache first before starting an analysis, so that
you can observe the performance difference between a cold cache and a warm cache.

SQL Server Integration Services Data Movement


SQL Server Integration Services data movement commonly occurs in one of two ways—either by using an
Execute SQL Task that runs Transact-SQL statements, or through a data flow task that is natively built
into SSIS.

The data flow task is broken down into the following three components:

 Sources

 Transforms

 Destinations

Transformations can often be a source of performance issues, and this can be dependent on the type of
transformations that are used. SSIS deals with the following three categories of transformation:

 Non-blocking transformations. These transformations use the same buffer space in memory to
both consume the input data and output the transformed data. This has a minimal impact on the
performance of the SSIS data flow. Examples include the Data Conversion transform and the Derive
Column transform.
 Semi-blocking transformations. These transformations use one area of buffer space in memory to
consume the input data—this will create additional buffer space for the output data. These
transformations might introduce additional CPU threads to process the data. Examples include the
Merge transform and the Merge Join transform.
 Blocking transformations. These transformations place the heaviest performance burden on the
SSIS subsystem. They must consume all of the input rows into buffer memory before they can produce
any output rows, creating additional buffer space for the output data, and they introduce additional
CPU threads to process the data. Examples include the Sort transform and the Aggregate transform.

SSIS logging will help to identify any transformations that are causing performance issues. In the case of
blocking transformations, it is often more prudent to remove these transforms from the SSIS packages and
replace them with Transact-SQL equivalents. For example, a Sort transformation could be replaced by an
ORDER BY clause in a Transact-SQL statement when retrieving data from a data source, as shown below.
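
The following sketch shows the idea; the table and column names are illustrative. The sorted source query
removes the need for a downstream Sort transformation. If later components rely on the sort order (for
example, a Merge Join transform), you should also mark the source output as sorted by setting the
IsSorted and SortKeyPosition properties in the Advanced Editor of the source component.

Sorting data in the source query instead of using a Sort transformation

--Used as the SQL command for the OLE DB Source so that a Sort
--transformation is no longer required in the data flow
SELECT CustomerKey, City, PostalCode
FROM dbo.DimCustomer
ORDER BY CustomerKey;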

For data movements that use the Execute SQL task, you should follow the steps for performance tuning
Transact-SQL queries. This includes using general logging and SSIS logging to identify the impact of
specific Execute SQL tasks, and then perform the targeted analysis with the appropriate query tuning tool.

In addition, you can advise the developer team to configure the package execution settings to optimize
the package for the production environment. Some properties to configure can include:
 RunInOptimizedMode. A data flow property that improves performance by removing unused columns,
outputs, and components from the data flow.
 DefaultBufferSize. This determines the size of each memory buffer that the data flow uses. The default is
10 MB but it can be set up to 100 MB.

 DefaultBufferMaxRows. This determines the number of rows that can be held in the buffer, within
the DefaultBufferSize limit—the default value is 10,000.
 EngineThreads. This sets the number of threads that the task can use during execution.

Report Generation
SQL Server Reporting Services reports can make use
of Transact-SQL, MDX or DAX queries to retrieve
data from a range of data sources. Reports are
manually pulled when the user browses the web
portal and clicks on a report. They can also be
scheduled to execute to facilitate the subscriptions
that are set up for users, or for the generation of
caches or snapshots.

It can sometimes be difficult to predict when “pulled” reports will be executed. Scheduled activities are
managed by the SQL Server Agent so, before
performing any query tuning of reports, it is worth
considering how SQL Server Agent jobs are distributed across time. For example, it is not uncommon for
SQL Server Agent jobs to execute at clearly defined intervals, such as quarter past, half past, or
quarter to the hour. A simple, even distribution of jobs will ensure that the server is not under an
unnecessarily heavy load—this can be done by adjusting the schedules to spread the load across the hour,
rather than at specific time intervals. However, with all this considered, you may have to conduct
performance tuning tasks to further improve the situation.
Beyond this, you should consider using the Execution Logs in Reporting Services to determine which
queries are taking the longest to generate. You should compare this with the general logging and
monitoring to establish which part of the SQL Server’s subsystem is being impacted. You can then
determine the appropriate query tuning tool to use to provide further analysis.
Lesson 3
Tools for Performance Tuning
A wide variety of tools can be used to performance tune queries. Some of these tools might even provide
recommendations on how a given query can be improved. The tools that can be used to resolve common
SSIS, SSAS, and SSRS issues are outlined here. You may choose to use only one tool, but it is likely that you
will use several of them to provide evidence for the fix that you apply.

Lesson Objectives
In this lesson, you will see how to use:

 Execution Plans.

 Query Store.

 Database Tuning Advisor.

 The Analysis Services Usage-based Optimization Wizard.


 Reporting Services Execution Logs.

Execution Plans
The performance of queries in a BI solution may
sometimes be substandard and require analysis. To
that end, SQL Server provides Execution Plans so
that the user can see how a query is being
executed. It may also make suggestions on which
indexes can be created to improve the performance
of a query.

Execution Plans is a graphical tool that uses icons to


display how the execution of a query has occurred.
The output of the query execution plan is read from
top right to bottom left. A cost is allocated against
each icon to denote the cost of the operation
during the query. If you hover your mouse over an icon, a tooltip will appear, providing more information
about the operation that the icon represents.

Interpreting execution plans accurately involves understanding the setup of a server, understanding the
server’s workload, and understanding SQL Server’s internal table and data structures. When armed with
this information, you can often achieve a substantial performance improvement by acting on what the
execution plan shows: modifying the query, or creating an index to improve its performance.

You can access Execution Plans when a query window is open in SQL Server Management Studio—you
can select Query on the menu bar, and then click Include Actual Execution Plan. Alternatively, you can
select Display Estimated Execution Plan without running the query.
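
If you prefer to work in Transact-SQL, you can also return plan information as XML by using set options.
In the following sketch, the query against FactPolicyQuotes is illustrative; SET SHOWPLAN_XML returns
the estimated plan without executing the batch, whereas SET STATISTICS XML executes the batch and
returns the actual plan.

Returning estimated and actual execution plans by using set options

--Return the estimated plan without executing the query
SET SHOWPLAN_XML ON;
GO
SELECT PolicyID FROM dbo.FactPolicyQuotes WHERE DateKey = 20170101;
GO
SET SHOWPLAN_XML OFF;
GO

--Execute the query and return the actual plan
SET STATISTICS XML ON;
GO
SELECT PolicyID FROM dbo.FactPolicyQuotes WHERE DateKey = 20170101;
GO
SET STATISTICS XML OFF;
GO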
The critical aspect of using Execution Plans is to look for the existence of a scan in the query operations. A
scan is a query operation that involves looking through the entire contents of a table or an index. In some
cases, this may be appropriate, but usually it is more efficient to perform index seeks. Adding indexes to a
table will improve query performance for targeted searches. The following scan and seek operators are
the ones to look for in the execution plan:

 Clustered Index Scan

 Clustered Index Seek

 Nonclustered Index Scan

 Nonclustered Index Seek

 Table Scan

A figure that shows a percentage of the relative cost of the operation, in the context of the whole query,
will appear under the icon—you should look for the icon with the highest percentage. Further information
can be found on any icon by hovering over the icon and displaying a tooltip. This returns the following
additional information:

Tooltip item Description

Physical Operation The physical operator used, such as Hash Join or Nested Loops.
Physical operators displayed in red indicate that the query
optimizer has issued a warning, such as missing column
statistics or missing join predicates. This can cause the query
optimizer to choose a less efficient query plan than otherwise
expected.
When the graphical execution plan suggests creating or
updating statistics, or creating an index, the missing column
statistics and indexes can be immediately created or updated
using the shortcut menus in SQL Server Management Studio
Object Explorer.

Logical Operation The logical operator that matches the physical operator, such
as the Inner Join operator. The logical operator is listed after
the physical operator at the top of the tooltip.

Estimated Row Size The estimated size of the row produced by the operator
(bytes).

Estimated I/O Cost The estimated cost of all I/O activity for the operation. This
value should be as low as possible.

Estimated CPU Cost The estimated cost of all CPU activity for the operation.

Estimated Operator Cost The cost to the query optimizer for executing this operation.
The cost of this operation as a percentage of the total cost of
the query is displayed in parentheses. Because the query
engine selects the most efficient operation to perform the
query or execute the statement, this value should be as low as
possible.

Estimated Subtree Cost The total cost to the query optimizer for executing this
operation and all operations preceding it in the same subtree.

Estimated Number of Rows The number of rows produced by the operator. This tooltip
item displays as Number of Rows in an Actual Execution Plan.
There are many other icons, such as Sort and Delete, that are returned by execution plans. Looking for
scans and seeks would be the starting point for examining the execution plans of queries.

Query Store
The SQL Server Query Store captures a history of the queries, execution plans, and runtime statistics for a
database, and persists this information inside the database itself. This capability allows you to find and fix
query performance regressions by forcing a previous query plan that performed better.

The Query Store is disabled by default, and must be enabled within each database where you want to
capture execution plans. You can enable the Query Store by using the following Transact-SQL statement:

Enabling Query Store on a database


ALTER DATABASE AdventureWorks SET QUERY_STORE = ON

A Query Store node will appear under the database where it has been enabled. Within this is a node
named Regressed Queries that shows queries whose performance has regressed, together with the
execution plans that have been recorded for them. In the Regressed Queries window, you can view the
queries and the associated execution plans for each query. You can also order the queries in the list based
on various criteria, such as CPU time, logical reads, and physical reads. The Query Store makes it much
easier to correlate execution plans against a set of criteria.

You can also use a feature known as Plan Forcing to make the query optimizer use a specific execution
plan for a given query. To force a plan, select a query and plan in the Regressed Queries window, and
then click Force Plan. You can only force plans that have been captured by the Query Store and are still
retained in it.
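You can achieve the same result in Transact-SQL. In the following sketch, the query_id and plan_id values
are illustrative; you would obtain the real values from the Query Store reports, or from the
sys.query_store_query and sys.query_store_plan catalog views.

Forcing and unforcing a plan by using Transact-SQL

--Force plan 5 for query 42 (illustrative identifiers)
EXEC sp_query_store_force_plan @query_id = 42, @plan_id = 5;
GO

--Remove the forced plan when it is no longer required
EXEC sp_query_store_unforce_plan @query_id = 42, @plan_id = 5;
GO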
A number of additional options can be defined when enabling the Query Store, and SQL Server provides
catalog views that give information about its state.
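
For example, the following sketch uses illustrative retention and capture values to configure the Query
Store, and then checks its current state and storage use.

Configuring and inspecting the Query Store

--Set illustrative retention and capture options
ALTER DATABASE AdventureWorks
SET QUERY_STORE (
    OPERATION_MODE = READ_WRITE,
    CLEANUP_POLICY = (STALE_QUERY_THRESHOLD_DAYS = 30),
    MAX_STORAGE_SIZE_MB = 500,
    QUERY_CAPTURE_MODE = AUTO
);
GO

--Check the actual state and storage use of the Query Store
SELECT actual_state_desc, desired_state_desc,
       current_storage_size_mb, max_storage_size_mb
FROM sys.database_query_store_options;
GO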

Database Engine Tuning Advisor


The Database Engine Tuning Advisor tool provides
advice on the appropriate indexes and partitions to
use—or not to use. This tool performs its analysis
with a provided workload file that you specify—it
then applies it to a database or individual tables
and makes recommendations on the indexes to use.
The analysis is only as good as the workload file
that you provide. A small workload file or a
workload file that does not reflect the queries
issued against your server will lead to the Database
Engine Tuning Advisor tool presenting inaccurate
results. If you use an appropriate workload file, the
Database Engine Tuning Advisor tool can be a useful resource in making appropriate index tuning
recommendations.

The Database Engine Tuning Advisor is a separate tool that is found in the Windows® Start menu. First,
you must authenticate to an instance of SQL Server, and define a name for the tuning session. Next, you
specify the workload file or table. There is also the option to use the plan cache as the source of the
workload, which is useful in scenarios where the server has been running for a long period of time.

You should then select the database and/or individual tables to tune. There are a range of tuning options
to use, such as tuning for Physical Design Structure. You can determine which partitioning strategy to
apply, and there are also advanced options that help you to restrict the amount of space used for the
recommendations, and whether or not to allow online index recommendations. When the options are set,
you can start the analysis.

On completion of the analysis, two additional tabs are presented. The Recommendations tab shows an
estimated improvement, with a breakdown of partition recommendations, if specified, and index
recommendations. This tab provides information including the database name, the object name, and the
recommendation. The Reports tab has a summary of the tuning process that has taken place.

There are also Tuning reports where you can select a specific report, such as an Index Usage report, that
will give you a list of indexes that are used, and the number of references to it, based on the workload file
that has been provided.

In the Recommendations tab, you can select or deselect all recommendations, or individual
recommendations. You can then use the Actions menu to apply, save, or evaluate the recommendations.

Analysis Services Usage-Based Optimization Wizard


The Query Logging feature used in Analysis Services
can be helpful when used in conjunction with the
Usage-Based Optimization Wizard.
You can use the Usage-Based Optimization Wizard
to design the aggregations in a cube that are based
on the queries that are run against the data model.
This is a better alternative to the Aggregation Design Wizard because the aggregation design is based on
the users’ query habits, in addition to the CPU and storage restrictions against Analysis Services.

On initial deployment, it is typical to use the Aggregation Design Wizard to provide overall performance
benefits. After the data model has been in the production environment for some time, the Usage-Based
Optimization Wizard can focus the aggregation design based on user queries.

The Usage-Based Optimization Wizard can be accessed in the Aggregations tab of the Cube designer
within Visual Studio. You can also access the wizard within SQL Server Management Studio by right-
clicking a partition within Object Explorer. The wizard asks you to perform the following steps:

 Select the partitions to modify. In the Usage-Based Optimization Wizard, this screen will allow you
to choose whether to apply the wizard to the entire cube or to specific partitions.
 Specify query criteria. In the Usage-Based Optimization Wizard, this screen allows you to select
criteria for the queries that you want to optimize. Queries can be selected based on date, user or
frequency.

 Review the queries that will be optimized. In the Usage-Based Optimization Wizard, this screen
allows you to select the specific queries that will be optimized from the queries returned by the
options defined within the Specify Query Criteria screen.

 Specify object counts. In the Usage-Based Optimization Wizard, this screen allows you to perform a
count of the objects within the cube and the dimensions that will provide a realistic level of data that
the data model will store.

 Set Aggregation Options. In the Usage-Based Optimization Wizard, this screen allows you to choose
the aggregation storage options, based on hard disk limits or performance limits.
 Complete the Wizard. In the Usage-Based Optimization Wizard, this screen allows you to specify
how the partitions are created and deployed before finishing the wizard.
The more query logging data that can be collected, the more valuable the Usage-Based Optimization
Wizard will be in designing effective aggregations. As a result, you might want to run Query Logging for a
period of time before using the data to optimize the aggregations.
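
Query logging is configured through the QueryLogConnectionString, QueryLogTableName, and
QueryLogSampling server properties of the Analysis Services instance. Assuming the default query log
table name of OlapQueryLog, the following sketch summarizes the captured activity so that you can judge
whether enough data has been collected; the column names reflect the schema that Analysis Services
creates by default.

Reviewing the Analysis Services query log

--Summarize logged subcube requests by database and user
SELECT MSOLAP_Database,
       MSOLAP_User,
       COUNT(*) AS LoggedRequests,
       AVG(Duration) AS AvgDuration
FROM dbo.OlapQueryLog
GROUP BY MSOLAP_Database, MSOLAP_User
ORDER BY LoggedRequests DESC;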

Report Execution Logs


Reporting Services provides useful views within the ReportServer database that enable you to view the
query activity that occurs on the report server. The ExecutionLog, ExecutionLog2, and ExecutionLog3
views can each be queried using Transact-SQL to find the most common queries executed.

The most important aspect of these views is that they can also be used to determine the time it takes for
a report to:

 Retrieve the data, with the TimeDataRetrieval column.

 Process the data, with the TimeProcessing column.

 Render the report, with the TimeRendering column.

If the information for the TimeDataRetrieval column is a high value, this should prompt you to focus on
optimizing the query that returns data to the report using features such as Execution Plans or Query Store.

If the TimeProcessing value is high, this might indicate that there is pressure on the Reporting Services
system. Performance Monitor counters for the operating system and Reporting Services can be used to
confirm if this is the case—particularly the Memory Threshold state found in Reporting Services object, or
the % Processor time in the Processor object. Viewing the Disk Queue length for the drives on which SSRS
is stored can also provide information about the disk subsystem.
A high TimeRendering value should prompt you to look at how the report has been constructed. For
example, if there is an external image used within the report that is stored on a network share, it may take
time to retrieve the image from that share; therefore, embedding the image within the report itself will
speed up the rendering process.

You can also join the contents of the view with other views and tables within the ReportServer database.
For example, the following query joins the ExecutionLog view with the Catalog table to return the name
of the report item and how it has performed at any given time:

Retrieving report execution information


SELECT
[C].[Name]
, [EL].[TimeDataRetrieval]
, [EL].[TimeProcessing]
, [EL].[TimeRendering]
, [EL].[TimeDataRetrieval]+[TimeProcessing]+[TimeRendering] AS TotalTime
, [EL].[Format]
, [EL].[Parameters]
, [EL].[username]
, [EL].[TimeStart]
FROM dbo.ExecutionLog EL
INNER JOIN dbo.Catalog C
ON EL.ReportID = C.ItemID
Lesson 4
Remediating Performance Issues
A number of techniques can be used to remediate performance issues that occur with a BI solution.
Before embarking on remediation, you should ensure that the data platform is stable, and that any
existing errors are resolved before performing changes to improve the queries. Using these techniques
effectively requires a deep understanding of the data model—the BI operations team should liaise with
the development team to ensure that any proposed changes are in line with the data model that has been
developed. Not all of the techniques outlined will necessarily solve an issue, and it is important that
changes are tested before they are deployed into production.

Lesson Objectives
At the end of this lesson, you will remediate performance issues by using:

 Indexing.
 Analysis Services partitioning.

 Report caching and snapshots.

 Refactoring queries.

Using Indexes
Indexes are SQL Server objects that can improve the performance of data retrieval by Transact-SQL
queries. There is a cost associated with indexes because they consume disk space; however, if indexes are
used in a pragmatic way, the benefits they bring outweigh this cost. In addition, indexes can slow down
insert and update operations that occur on the database. These operations are predictable in a data
warehouse, so the indexing can be managed: indexes can be dropped while data is being loaded or
updated, and then recreated on the relevant tables when the load is completed, ready for other BI
components to use.

SQL Server stores data in 8 KB pages, which are grouped into 64 KB extents of eight pages each. A single
table may be spread over many extents across the disk in a data structure called a heap. For SQL Server to
retrieve this data, it must identify the extents that belong to the table.

Indexes bring order to the data and come in two forms. In a traditional indexing structure there are
clustered indexes and nonclustered indexes. You can create only a single clustered index against one or
more columns of a table or a view. When this is done, the actual data that is stored in a heap is physically
and logically reordered into a contiguous space on the disk. This makes the retrieval of the data quicker,
especially when a query is based on a column that is part of the index.

Nonclustered indexes are typically created to support the queries that are used to retrieve data from a
table or a view. There can be up to 999 nonclustered indexes per table. Nonclustered indexes do not
physically reorder the data; instead, they create an index of pointers to where the data is stored on the
disk. Queries that look for matching values or return a small range of values will perform better with this
type of indexing.
Typically, clustered indexes are created on key columns to organize the data, with nonclustered indexes
created to support the queries that are issued against the data warehouse.

You can create indexes using Transact-SQL statements.

Creating clustered and nonclustered indexes


--Creating a clustered index on the PolicyID column on the
--FactPolicyQuotes table named CL_FactPolicyQuotes_PolicyID
CREATE CLUSTERED INDEX CL_FactPolicyQuotes_PolicyID
ON dbo.FactPolicyQuotes(PolicyID);

--Creating a nonclustered index on the City column on
--the DimCustomer table named NCL_DimCustomer_City
CREATE NONCLUSTERED INDEX NCL_DimCustomer_City
ON dbo.DimCustomer(City);

Columnstore indexes are ideal for large data stores that are typically associated with data warehouses—
especially those that store fact table data. The key benefit of columnstore indexing is that it can compress
the data to a high ratio, providing much higher performance than that previously achieved with
traditional indexes. With SQL Server, the capability has been enhanced to include columnstore indexes for
use with real-time analytics on operational workloads.
In the context of data warehousing, when columnstore indexing is used with an aligned table partitioning
strategy, the retrieval of data is much faster, particularly with full table scans. In SQL Server, you can also
use one nonclustered index that helps when performing targeted or range searches against tables that
have a defined columnstore index. As a result, a columnstore index would be created against an entire
table to take advantage of the compression and performance, with a nonclustered index created on a
column that has either queries in range or targeted queries.

Columnstore indexes can also be created using Transact-SQL statements.

Creating columnstore indexing


--Store the entire FactPolicyQuotes table as a columnstore index.
CREATE CLUSTERED COLUMNSTORE INDEX CS_FactPolicyQuotes ON FactPolicyQuotes;
GO

--Add a nonclustered index on the DateKey column of the FactPolicyQuotes table.
CREATE NONCLUSTERED INDEX CSN_FactPolicyQuotes_DateKey ON FactPolicyQuotes (DateKey);

Analysis Services Partitioning


Whilst designing aggregations using the Usage-
Based Optimization feature will improve the
performance of the queries that are issued against a
data model, you should also consider improving it
further by using partitioning. This enables you to
physically distribute the data within the cube across
multiple physical disks. It can also reduce the time it
can take to process a cube by processing at a
partition level, rather than at the cube level.
Furthermore, in multidimensional mode, each partition can be configured with its own storage mode. For
example, you may create a partition that stores the current year’s data in multidimensional online
analytical processing (MOLAP) storage mode, and stores the historical data in a separate partition in
hybrid online analytical processing (HOLAP) storage mode. Further performance gains can be achieved if
the Analysis Services cube partitioning is aligned to the table partitioning strategy that is created in data
warehouse tables. By default, a single partition is created for each measure group within the cube. To
remove a partition that you no longer need, you can right-click the partition and click Delete.

To create a new partition, click the partition node, and then click New Partition. Click Next and define
the source information, which would typically be a fact table. On the next screen, you can then define a
query that includes a WHERE clause to restrict the data that will be stored in the partition. For example,
you could join the fact table to a DimDate table on the order date key and the date key, and then use the
WHERE clause to limit the FullDateAlternateKey column to the date range that the partition should hold,
as shown in the following example.
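
The table and column names in this sketch (FactPolicyQuotes, DimDate, OrderDateKey, DateKey, and
FullDateAlternateKey) are illustrative, and should be replaced with the names used in your own data
warehouse.

Example partition source query bound to a date range

--Return only the fact rows for the 2017 calendar year for this partition
SELECT f.*
FROM dbo.FactPolicyQuotes AS f
INNER JOIN dbo.DimDate AS d
    ON f.OrderDateKey = d.DateKey
WHERE d.FullDateAlternateKey >= '20170101'
  AND d.FullDateAlternateKey < '20180101';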

You can specify the processing location of the partition, which is either the current server instance or a
remote instance of Analysis Services. You can then define the storage location, which can be the default
server location or a location of choice. Click Next and define a name for the partition. You can design the
aggregations for the partition now, design the aggregation later, or copy the aggregation from an
existing partition before finishing off the wizard.

Report Caching and Snapshots


The option to configure caching and snapshots is available in all versions of SQL Server, and has been
simplified in recent releases of Reporting Services. You can use Reporting Services to configure caches and
snapshots to improve the performance of retrieving report data. When a report is deployed to the report
server, by default, no caching or snapshot settings are defined. As a result, when you run a report, the
most recent copy of the data is retrieved from the data source. A copy is stored within the HTTP session
cache for the user. However, should another user open another connection to the same report, the data
will be retrieved again from the data source.

Caching

To improve the performance of retrieving the report data, you can cache a temporary copy of the report.
With this setting defined, when a report is first run, it is retrieved from the data source and stored in the
cache within the ReportServerTempDB database. Subsequent execution of the same report retrieves the
report data from the cache. It is important to note that, on a server reboot or a service restart, the
contents of the ReportServerTempDB database are emptied.

Snapshot

An alternative approach is to create snapshots. Snapshots can be created based on a schedule in advance
of the user browsing the report. The report snapshot is stored within the ReportServer database and the
user will browse the report snapshot stored within the ReportServer. Snapshots can also be used as the
basis to store historical copies of the report for future reference.
Impact on report parameters

If a report contains parameters, the parameter value defined determines the data that is returned to the
cache. When parameters are used with a snapshot, the parameter value cannot be changed. However,
filters return all of the data to the report server cache. Furthermore, if a snapshot is defined on a report
with a filter, the report parameter that uses the filter can have its value changed.

You can configure the caching and snapshot options for a report by using the Manage option in the web
portal. To verify whether reports are being served from the cache or a snapshot, you can query the report
execution log, as shown below.
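
The following sketch uses the ExecutionLog3 view, described earlier in this module, to count executions by
the source that served them; the Source column distinguishes values such as Live, Cache, Snapshot, and
History, so you can confirm whether caching or snapshots are actually being used.

Checking how report requests are being served

--Count report executions by the source that served them
SELECT ItemPath, Source, COUNT(*) AS Executions
FROM dbo.ExecutionLog3
GROUP BY ItemPath, Source
ORDER BY ItemPath, Source;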

Refactoring Queries
Transact-SQL is a declarative language—this means
that a query can be written in a number of different
ways to return the same results. However, queries
might perform at different speeds. As a result, it
may sometimes be more pragmatic to rewrite a
query to be more efficient.

When a query is executed, SQL Server evaluates the clauses of the Transact-SQL statement in this order:

 The FROM and the JOIN clauses identify the sources of the data.

 The WHERE clause then filters the rows, and the SELECT clause projects the columns, to produce a
subset of the data.

The same steps are used in an aggregate query, but then SQL Server will evaluate the columns in the
GROUP BY clause to group the data together. If a HAVING clause is used in conjunction with the GROUP
BY clause, a filter is applied to the grouped data. If an ORDER BY clause is in the query, it will then sort the
results. You can improve the performance of the query by including columns in the index that are in the
query. This can have a noticeable impact when including columns that are defined in a WHERE clause.
This can be especially useful when working with a GROUP BY clause, as the WHERE clause can be used to
filter the set of data before the GROUP BY clause is applied. If you use the HAVING clause to filter the
data, an index is unlikely to be used, because the filter is applied after the aggregation has been performed.
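
As an example of including query columns in an index, the following sketch creates a nonclustered index
on the column used in the WHERE and GROUP BY clauses, and includes the column that is aggregated, so
that the query can be satisfied from the index alone. The FactPolicyQuotes table and the QuoteAmount
column are illustrative.

Creating a covering index for a filtered aggregate query

--Supports a query such as:
--SELECT DateKey, SUM(QuoteAmount)
--FROM dbo.FactPolicyQuotes
--WHERE DateKey BETWEEN 20170101 AND 20170131
--GROUP BY DateKey;
CREATE NONCLUSTERED INDEX NCL_FactPolicyQuotes_DateKey_QuoteAmount
ON dbo.FactPolicyQuotes (DateKey)
INCLUDE (QuoteAmount);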

However, if this does not work, there are a number of guiding principles to consider should a query need
rewriting from a performance point of view:
 Use set-based logic rather than cursor logic. The query optimization process in SQL Server is
optimized to interpret and optimize set-based operations. Procedural or cursor-based logic cannot
take full advantage of the capability that the query optimizer has to offer, so it will perform at a
suboptimal level.

 Avoid using query hints. If you are experiencing performance issues with a query, look out for query
hints that may be explicitly defined in a query. Query hints direct how a query should retrieve data
and ignore the suggestions made by the query optimization process. It is worth baselining the
performance of a query with the query hint in place—you can then comment out the hint and rerun
the query to establish the delta in performance. There are limited scenarios where white papers might
advise the use of query hints; however, subsequent cumulative updates may negate the need to use
them if there is a fix in the update.
 Rewrite SQL queries to remove correlated subqueries. Correlated subqueries evaluate the data
that exists in multiple tables on a row-by-row basis and therefore impact the performance of the
query. The EXISTS clause may introduce an improvement, but it is better to try to rewrite correlated
subqueries as a JOIN query, to take full advantage of the query optimization process (see the sketch
after this list).
 Avoid using scalar user-defined functions in the WHERE clause. Scalar user-defined functions in
the WHERE clause are not optimized as part of a query plan and are evaluated on a row-by-row basis.

 Rewrite SQL to simplify a query using CTEs or Temp tables. Try to break up long and complex
queries into smaller units of work by using Common Table Expressions or temporary tables. This will
reduce the IO required for the query optimizer to find a good plan against a subset of data that can
be joined later to produce a final result set.
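
The following sketch illustrates the correlated subquery rewrite mentioned above; the table and column
names are illustrative. Both statements return each customer with the total of their quote amounts, but
the second form gives the query optimizer a set-based join to work with.

Rewriting a correlated subquery as a JOIN

--Correlated subquery: the inner query is evaluated for each customer row
SELECT c.CustomerKey,
       (SELECT SUM(f.QuoteAmount)
        FROM dbo.FactPolicyQuotes AS f
        WHERE f.CustomerKey = c.CustomerKey) AS TotalQuotes
FROM dbo.DimCustomer AS c;

--Set-based rewrite: join and aggregate in a single pass
SELECT c.CustomerKey, SUM(f.QuoteAmount) AS TotalQuotes
FROM dbo.DimCustomer AS c
LEFT JOIN dbo.FactPolicyQuotes AS f
    ON f.CustomerKey = c.CustomerKey
GROUP BY c.CustomerKey;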

These guiding principles provide a starting point for refactoring a query. However, you should always be
guided by the evidence that is presented in the monitoring tools, such as Execution Plans or the Query
Store.
Lab: Performance Tuning a BI Solution


Scenario
Adventure Works Cycles is a global corporation that manufactures and sells bicycles and accessories. The
company sells through an international network of resellers, and has a direct sales channel through an e-
commerce website.

You are a consultant working with the BI operations team to improve the operational management of
their current BI solution. You have recently been working with the operations team to remediate issues
that have occurred with the BI solution. This has been resolved and now the focus of the team is to look
for improvements in the extracts of source data within the BI solution.

Objectives
At the end of this lab, you will be able to:

 Performance tune BI queries.

 Use the SQL Server Query Store.


 Remediate performance issues.

Estimated Time: 45 minutes

Virtual machine: 10988C-MIA-SQL

User name: ADVENTUREWORKS\Student

Password: Pa55w.rd

Exercise 1: Performance Tuning BI Queries


Scenario
The BI operations team have used SQL Server Profiler to identify three queries that appear to be long
running, and want to look at ways to improve the performance of these queries. The queries have been
stored in the D:\Labfiles\Lab07\Starter folder and are named Query1.sql, Query2.sql and Query3.sql. You
have decided that the team should use Execution Plans to identify what is causing the queries to be slow.

The main tasks for this exercise are as follows:

1. Prepare the Lab Environment

2. Analyzing Queries with Execution Plans

3. Identify Performance Issue with BI Queries

 Task 1: Prepare the Lab Environment


1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. Run Setup.cmd in the D:\Labfiles\Lab07\Starter folder as Administrator.

 Task 2: Analyzing Queries with Execution Plans


1. Open SQL Server Management Studio.

2. Connect to the MIA-SQL SQL Server instance.

3. Open the file Query1.sql located in the D:\Labfiles\Lab07\Starter folder.

4. Execute the query with the Include Actual Execution Plan option set when running the query.
5. Open the file Query2.sql located in D:\Labfiles\Lab07\Starter folder.

6. Execute the query with the Include Actual Execution Plan option set when running the query.

7. Open the file Query3.sql located in D:\Labfiles\Lab07\Starter folder.

8. Execute the query with the Include Actual Execution Plan option set when running the query.

 Task 3: Identify Performance Issue with BI Queries


1. Collaborate with two or three other students.

2. Use Queries.docx in the D:\labfiles\Lab07\Starter folder as a framework to identify the issues with
the queries based on the execution plans.

3. Close both Word documents.

Results: At the end of this exercise, you will be able to:

Trace the actual execution plan of queries.

Use this information to identify performance issues with queries.

Exercise 2: Exploring SQL Server Query Store


Scenario
The BI operations team are unaware of the new functionality that is available with the SQL Server Query
Store in SQL Server. You will explore the capability of the Query Store and show the information that this
feature can provide the team when analyzing queries that are executing against the server.
The main tasks for this exercise are as follows:

1. View the Overall Resource Consumption Report

2. View the Top Resource Consuming Queries Report


3. View the Tracked Queries Report

 Task 1: View the Overall Resource Consumption Report


 Open Query Store and view the Overall Resource Consumption report. Configure the report to
display the resources consumed in the last hour.

 Task 2: View the Top Resource Consuming Queries Report


1. View the Top Resource Consuming Queries report.

2. Unforce an execution plan against a query.

 Task 3: View the Tracked Queries Report


 View the Tracked Queries report.

Results: At the end of this exercise, you will be able to:

View the Overall Resource Consumption report.


View the Top Resource Consuming queries.

Unforce an execution plan for a query.

Track a query with the Tracked Queries Report.


Exercise 3: Remediating Performance Issues


Scenario
Based on the findings in Exercise 1, you are going to remediate the performance issues of the three queries
that have been brought to your attention by the BI operations team. You will either use indexes to
improve the performance of the query or you will rewrite the query so that it performs in a more optimal
manner.

The main tasks for this exercise are as follows:

1. Rewrite Query1 for Better Performance

2. Rewrite Query2 for Better Performance

3. Using Indexes to Improve Query Performance

 Task 1: Rewrite Query1 for Better Performance


1. Rewrite Query1.sql to improve the performance.

2. View the execution plan for the rewritten query.

 Task 2: Rewrite Query2 for Better Performance


1. Rewrite Query2.sql to improve the performance.
2. View the execution plan for the rewritten query.

 Task 3: Using Indexes to Improve Query Performance


1. Create a clustered index to improve the query performance of Query3.

2. View the execution plan for the rewritten query.

Results: At the end of this exercise, you will have:

Rewritten a query for better performance.

Added indexes to improve query performance.

Question: How often do you get the opportunity to review the production queries that are
working on your systems?

Question: Will you use the Query Store feature? What benefits do you see it bringing to
your organization?
Module Review and Takeaways


In this module, you have explored the range of queries that are used within a BI solution, and the tools
that can be used to investigate performance issues with queries. You have explored options to improve
query performance, which can include introducing indexes, or rewriting a query so that it is more
optimal. You are now able to:

 Understand the need for performance tuning.

 Describe the BI queries required to carry out performance tuning.

 Use the tools for performance tuning.

 Remediate performance problems.


Course Evaluation

Your evaluation of this course will help Microsoft understand the quality of your learning experience.

Please work with your training provider to access the course evaluation form.

Microsoft will keep your answers to this survey private and confidential and will use your responses to
improve your future learning experience. Your open and honest feedback is valuable and appreciated.
Module 1: Introduction to Operational Management in BI


Solutions
Lab: Introduction to Operational
Management in BI Solutions
Exercise 1: Roles in BI Operations
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab01\Starter folder.


4. Right-click Setup.cmd, and then click Run as administrator.

5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

 Task 2: Review the Transcript


1. In the D:\Labfiles\Lab01\Starter folder, double-click Interviews.docx to open it in WordPad.
2. Read the interviews in the document.

 Task 3: Identify Roles to Support BI Operations


1. Form a small group with two or three other students.

2. Discuss the interviews and identify roles, responsibilities and potential employees.
3. In File Explorer, open Roles.docx in the D:\Labfiles\Lab01\Starter folder using WordPad.

4. Based on the available information, review the roles and assess the responsibilities required, and then
decide which employees should work in those roles in Roles.docx.

5. Close both documents.

Results: At the end of this exercise, you should have created a table that shows the roles required, with a
named employee who has key responsibilities.

Exercise 2: Using Team Explorer in Visual Studio


 Task 1: Connecting to a Team Foundation Server
1. On the Start screen, type Visual Studio 2017, and then press Enter.
2. On the Team menu, click Manage Connections. Wait until Team Explorer has completed its
connection to the server.
3. In the Team Explorer - Connect pane, under Connect, click Manage Connections, and then click
Connect to a Project.

4. In the team project collections box, click AdventureWorks BISolutions.

5. In the Connect to a Project box, click Connect.


 Task 2: Create a Team Project Within Team Foundation Server


1. In Visual Studio 2017, in the Team Explorer pane, in the Home drop-down menu, point to Projects
and My Teams, and then click New Team Project.

2. In the New Team Project on mia-sql\AdventureWorks BISolutions dialog box, on the Specify the
Team Project Settings page, in the What is the name of the team project? box, type Adventure
Works, and then click Next.

3. On the Select a Process Template page, in the Which process template should be used to create
the team project? list, click Scrum, and then click Next.

4. On the Specify Source Control Settings page, in the Choose a version control system for the
new project list, ensure Team Foundation Version Control is selected, and then click Next.

5. On the Confirm Team Project Settings page, click Finish.

6. On the Team Project Created page, click Close.

 Task 3: Create an Integration Services Project in Team Foundation Server


1. In Visual Studio 2017, on the File menu, point to New, and then click Project.

2. In the New Project dialog box, in the Templates list, click Business Intelligence, and then click
Integration Services Project.
3. In the Name box, type AWMigration, and then click Browse.

4. In the Project Location dialog box, browse to the D:\Labfiles\Lab01\Starter folder, and then click
Select Folder.
5. Ensure the Add to source control check box is checked.

6. In the Solution name box, type AWMig, and then click OK.

7. In the Add Solution AWMig to Source Control dialog box, review the settings, and then click OK.
8. In Solution Explorer, under SSIS Packages, right-click Package.dtsx, and then click Rename.

9. Type AWMig_Control.dtsx, and then press Enter.

10. Right-click the AWMig solution, and then click Check In.

11. In the Microsoft Visual Studio dialog box, click Yes to save all changes.

12. In the Team Explorer - Pending Changes pane, in the Comment box, type Control package for data
loads, and then click Check In.

13. In the Check-in Confirmation dialog box, click Yes.

14. In the Team Explorer - Pending Changes pane, verify that a message stating that the changeset
committed successfully appears.

15. In the Team Explorer - Pending Changes pane, click the Home icon.

16. Click Source Control Explorer.

17. In the Source Control Explorer tab, in the Folders section, click Adventure Works.
18. In the details section, double-click AWMig.

19. Confirm that a solution file appears in the details window with the name AWMig.sln.

20. Close Visual Studio 2017 and save all changes if prompted.
Results: At the end of the exercise, you will have configured Team Explorer to connect to a TFS server
named mia-sql. You will have created a project collection and stored an Integration Services project
within the project collection in the TFS server. You will have made a change to an object and checked the
object back in to TFS. Finally, you will view the changes in Source Control Explorer.
Module 2: Configuring BI Components


Lab: Configuring BI Components
Exercise 1: Standardizing the Data Platform
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab02\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

 Task 2: Review the MIA-SQL Server Configuration


1. Form a small group with two or three other students.
2. Review the MIA-SQL Server configuration, focusing on the operating system, the database engine,
Analysis Services and Reporting Services.
3. In the D:\labfiles\Lab02\Starter folder, double-click PlatformReview.docx to open the file in
WordPad.

4. Fill in the area of impact and the recommended change. Aim for two recommendations per
application section.
5. Discuss the findings with the instructor and the class.

6. Close WordPad.

Results: At the end of this exercise, you should have created a table that shows which areas of the data
platform should be standardized, including:

The operating system.

The MIA-SQL database engine instance.

The MIA-SQL Analysis Services instance.

The MIA-SQL Reporting Services instance.

Exercise 2: Configuring the Operating System


 Task 1: Setting the Performance Options.
1. On the Start screen, click Control Panel.
2. In the Control Panel, click System and Security.

3. In the System and Security window, click System.

4. In the System window, under Control Panel Home, click Advanced system settings.
5. In the System Properties dialog box, on the Advanced tab, under Performance, click Settings.

6. In the Performance Options dialog box, on the Visual Effects tab, ensure Adjust for best
performance is selected.

7. On the Advanced tab, under Processor scheduling, ensure Background services is selected, and
then click OK.

8. In the System Properties dialog box, click OK.

9. Close the System window.

 Task 2: Set Lock Pages in Memory


1. On the Start page, type gpedit.msc, and then press Enter.

2. In the Local Group Policy Editor window, expand Windows Settings, expand Security Settings,
expand Local Policies, and then click User Rights Assignment.

3. In the details pane, double-click Lock pages in memory.


4. In the Lock pages in memory Properties dialog box, click Add User or Group.

5. In the Select Users, Computers, Service Accounts, or Groups dialog box, type ServiceAcct, click
Check Names, and then click OK.
6. In the Lock pages in memory Properties dialog box, click OK.

 Task 3: Set Perform Volume Maintenance Settings


1. In the details pane, double-click Perform volume maintenance tasks.

2. In the Perform volume maintenance tasks Properties dialog box, click Add User or Group.
3. In the Select Users, Computers, Service Accounts, or Groups dialog box, type ServiceAcct, click
Check Names, and then click OK.

4. In the Perform volume maintenance tasks Properties dialog box, click OK.

5. Close the Local Group Policy Editor window.

6. For this change to take effect, log out, and then log back in as ADVENTUREWORKS\Student with
the password Pa55w.rd.

Results: At the end of this exercise, you will have:

Set performance options.

Set lock pages in memory.

Set Perform Volume maintenance.


Exercise 3: Configuring the Database Engine


 Task 1: Modifying SQL Server Memory
1. On the taskbar, click Microsoft SQL Server Management Studio.

2. In the Connect to Server dialog box, connect to the MIA-SQL instance of the SQL Server database
engine by using Windows authentication.

3. On the toolbar, click New Query.

4. In the query editor, type the following code:

EXEC sys.sp_configure 'show advanced options', '1'


RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure 'min server memory (MB)', '2048'
GO
EXEC sys.sp_configure 'max server memory (MB)', '4096'
GO
RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure 'show advanced options', '0'
RECONFIGURE WITH OVERRIDE
GO

5. On the toolbar, click Execute.

 Task 2: Optimizing for Ad Hoc Workloads


1. On the toolbar, click New Query.
2. In the query editor, type the following code:

EXEC sys.sp_configure N'show advanced options', N'1'


RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'optimize for ad hoc workloads', N'1'
GO
RECONFIGURE WITH OVERRIDE
GO
EXEC sys.sp_configure N'show advanced options', N'0'
RECONFIGURE WITH OVERRIDE
GO

3. On the toolbar, click Execute.

 Task 3: Moving tempdb Data Files


1. In Object Explorer, expand Databases, expand System Databases, right-click tempdb, and then click
Properties.

2. In the Database Properties - tempdb dialog box, under Select a page, click Files.

3. Note the number of data files present and the location of the data files, and then click Cancel.
4. On the toolbar, click New Query.
5. In the query editor, type the following code:

ALTER DATABASE tempdb


MODIFY FILE (NAME = tempdev, FILENAME = 'G:\Microsoft SQL
Server\MSSQLSERVER\Data\tempdb.mdf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp2, FILENAME = 'G:\Microsoft SQL
Server\MSSQLSERVER\Data\tempdb_mssql_2.ndf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp3, FILENAME = 'G:\Microsoft SQL
Server\MSSQLSERVER\Data\tempdb_mssql_3.ndf');
GO
ALTER DATABASE tempdb
MODIFY FILE (NAME = temp4, FILENAME = 'G:\Microsoft SQL
Server\MSSQLSERVER\Data\tempdb_mssql_4.ndf');
GO

6. On the toolbar, click Execute.

 Task 4: Moving tempdb Log Files


1. On the toolbar, click New Query.

2. In the query editor, type the following code:

ALTER DATABASE tempdb


MODIFY FILE (NAME = templog, FILENAME = 'F:\Microsoft SQL
Server\MSSQLSERVER\Logs\templog.ldf');
GO

3. On the toolbar, click Execute.


4. In Object Explorer, right-click MIA-SQL, and then click Restart.

5. In the User Account Control dialog box, click Yes.

6. In the Microsoft SQL Server Management Studio dialog box, click Yes.
7. In the Microsoft SQL Server Management Studio dialog box, click Yes.

8. In Object Explorer, right-click MIA-SQL, and then click Refresh.


9. In Object Explorer, expand Databases, expand System Databases, right-click tempdb, and then click
Properties.

10. In the Database Properties - tempdb dialog box, under Select a page, click Files.

11. Note the number of data files present and the location of the data files, and then click Cancel.

12. In Object Explorer, right-click MIA-SQL, and then click Properties.

13. In the Server Properties - MIA-SQL dialog box, under Select a page, click Advanced.

14. Under Miscellaneous, ensure that Optimize for Ad hoc Workloads is set to True.

15. Under Select a page, click Memory.

16. Ensure that the Minimum server memory (in MB) is 2048, and the Maximum server memory (in
MB) is 4096, and then click Cancel.
17. Close SQL Server Management Studio, without saving any changes.

Results: At the end of this exercise, you will have:

Modified the SQL Server memory.

Configured the MIA-SQL database instance to be optimized for ad hoc workloads.

Moved tempdb data files to the G:\.


Moved the tempdb log file to the F:\.

Exercise 4: Configuring Reporting Services


 Task 1: Modifying the Reporting Services Memory
1. On the Start page, type Visual Studio 2017, right-click Visual Studio 2017, and then click Run as
administrator.

2. In the User Account Control dialog box, click Yes.

3. On the File menu, point to Open, and then click File.


4. In the Open File dialog box, browse to the C:\Program Files\Microsoft SQL Server Reporting
Services\SSRS\ReportServer folder, and then double-click rsreportserver.config.

5. In Visual Studio, press Ctrl+F.


6. In the Find dialog box, type memorythreshold, and then click Find Next.

7. In the <Service> element, under <MemoryThreshold>90</MemoryThreshold>, type the following code:

<WorkingSetMaximum>3000000</WorkingSetMaximum>
<WorkingSetMinimum>2400000</WorkingSetMinimum>

8. On the File menu, click Save All.

9. Close Visual Studio.

Results: At the end of this exercise, you will have:

Modified the memory setting for Reporting Services.



Module 3: Managing Business Intelligence Security


Lab: Managing Business Intelligence
Security
Exercise 1: Setting Up Security in SQL Server
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab03\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. Click Yes when prompted to confirm that you want to run the command file, and then wait for the
script to finish.

 Task 2: Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and
Windows Authentication
1. On the taskbar, click Microsoft SQL Server Management Studio.
2. To connect to the MIA-SQL SQL Server instance, in the Connect to Server dialog box, ensure that
the Server type is Database Engine, Authentication is Windows Authentication, and then in the
Server name list select MIA-SQL.

3. Click Connect.

4. In Object Explorer, right-click MIA-SQL, and then click Properties.

5. In the Select a page pane, click Security.


6. On the Security page, under Server Authentication, click SQL Server and Windows
Authentication mode, and then click OK (it might already be set to this mode).

7. If a Microsoft SQL Server Management Studio message box pops up, click OK.

8. In Object Explorer, right-click MIA-SQL, and then click Restart.

9. In the User Account Control message box, click Yes.

10. In the Microsoft SQL Server Management Studio message box, click Yes twice.

 Task 3: Create SQL Logins for the DL_ReadSalesData Windows Groups


1. To create a login for the DL_ReadSalesData, in Object Explorer, expand the Security node.

2. Right-click Logins, and then click New Login.

3. In the Login – New dialog box, click Search.

4. In the Select User or Group dialog box, click Advanced.

5. Click the Object Types button.


6. In the Object Types dialog box, clear the Built-in security principals check box. Clear the Users
check box and clear the Other objects check box. Select the Groups check box, and then click OK.

7. Click the Locations button.

8. In the Locations dialog box, click Entire Directory, and then click OK.

9. In the Select User or Group dialog box, click Find Now.

10. In the Search Results box, double-click DL_ReadSalesData.

11. In the Select User or Group dialog box, click OK.

12. In the Login – New dialog box, ensure that the Windows Authentication option is selected, and
then in the Default database list, select AdventureWorks.

13. Click OK.

 Task 4: Create SQL Logins for the Sales Application


1. To create a SQL login named SalesApp, in Object Explorer, right-click Logins, and then click New
Login.

2. In the Login – New dialog box, in the Login Name text box, type SalesApp.

3. Select the option SQL Server Authentication.

4. In the Password text box, type Pa55w.rd.


5. In the Confirm Password text box, type Pa55w.rd, and then click to clear the Enforce password
policy check box.

6. In the Default database list, select AdventureWorks.

7. Click OK.
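
As an optional reference, the equivalent T-SQL for this SQL Server authentication login is shown below. This is a minimal sketch, not part of the lab steps; CHECK_POLICY = OFF mirrors clearing the Enforce password policy check box:

-- Create a SQL Server authentication login for the sales application.
CREATE LOGIN SalesApp
WITH PASSWORD = N'Pa55w.rd',
     DEFAULT_DATABASE = AdventureWorks,
     CHECK_POLICY = OFF;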

 Task 5: Create a Database User in the EIM_Demo Database from the Windows Group DL_ReadSalesData and Grant Select Permission to the EDW Schema
1. In Object Explorer, expand Databases, expand EIM_Demo, and then expand Security.

2. Right-click the Users folder, and then click New User.

3. In the Database User - New dialog box, in the User type drop-down list, select SQL User with Login.

4. In the User name text box, type EIM_SalesReaders.

5. Select the option Login Name and click the ellipsis (…) button.

6. In the Select Login dialog box, click Browse.

7. In the Browse for Objects dialog box, select the AdventureWorks\DL_ReadSalesData check box,
and then click OK.

8. In the Select Login dialog box, click OK.


9. In the Default schema box, type EDW.

10. Click OK.

11. To grant Select permission for the EIM_SalesReaders user over the EDW schema, in Object Explorer,
expand Databases, expand EIM_Demo, expand Security, and then expand Schemas.

12. Right-click EDW, and then click Properties.

13. In the Schema Properties - EDW window, under the Select a page pane, click Permissions.

14. Click Search.

15. In the Select Users or Roles dialog box, click Browse.



16. In the Browse for Objects dialog box, click the EIM_SalesReaders check box, and then click OK.

17. In the Select Users or Roles dialog box, click OK.

18. In the Schema Properties - EDW window, under Permissions for EIM_SalesReaders, click the Select
check box in the Grant column.

19. Click OK.

20. Close SQL Server Management Studio.
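
As an optional reference, the database user and schema permission created in this task can also be scripted. This is a minimal sketch, not part of the lab steps; it assumes the EIM_Demo database, the EDW schema, and the DL_ReadSalesData login already exist:

USE EIM_Demo;
GO
-- Create a database user for the Windows group login and default it to the EDW schema.
CREATE USER EIM_SalesReaders
FOR LOGIN [ADVENTUREWORKS\DL_ReadSalesData]
WITH DEFAULT_SCHEMA = EDW;
GO
-- Grant read access to every object in the EDW schema.
GRANT SELECT ON SCHEMA::EDW TO EIM_SalesReaders;
GO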

Results: At the end of this exercise, you will have:

Set up the authentication model for a SQL Server instance.

Created a SQL login using a Windows group.


Created a SQL Server login.

Created a database user.

Granted a database user Select permission on a schema.

Exercise 2: Setting Up Security in SQL Server Analysis Services


 Task 1: Opening Up a Team Foundation Server Project
1. Start Visual Studio 2017.

2. On the Team menu, click Manage Connections.


3. In the Team Explorer – Connect pane, click Manage Connections, then click Connect to a Project.

4. In the Connect to a Project dialog box, expand mia-sql, and then click
AdventureWorksBISolutions.

5. Click Connect.

6. In the Team Explorer - Home pane, right-click AW_BI.sln, then click Open.

 Task 2: Creating an Analysis Services Database Role


1. In the Solution Explorer pane, under AW_BI, expand AW_SSAS, right-click Roles, and then click New
Role.

2. In Solution Explorer, expand Roles, and click Role.role.

3. In the Properties pane, change the File Name to DB Process Role.role, and then press Enter. When
prompted, click Yes to change the object name.

4. On the General page of the Role Designer, select the Process Database and Read Definition check
boxes.
5. Click the Membership tab, and then in the Specify the users and groups for this role area, click
Add.

6. In the Select Users or Groups window, click Locations.

7. Click Entire Directory, and then click OK.

8. In the Enter the object names to select box, type GOBrien, click Check Names, and then click OK.

9. On the File menu, click Save All.

10. In Solution Explorer, right-click the AW_SSAS solution, and then click Deploy. If prompted to
overwrite the existing database, click Yes.

11. When the deployment has completed successfully, close the Deployment Progress window.

 Task 3: Testing Analysis Services Permissions


1. On the Windows Start menu, click the user icon, and then click Sign out.

2. Log in as ADVENTUREWORKS\GOBrien with password Pa55w.rd.

3. Click Start, and type SQL Server Management Studio.

4. Click Microsoft SQL Server Management Studio 2017.


5. In the Connect to Server dialog box, in the Server Type drop-down list, click Analysis Services.

6. In the Server Name drop-down list, type MIA-SQL, and then click Connect.

7. In Object Explorer, expand Databases, right-click the AW_SSAS database, and then click Process.

8. In the Object list, in the Process Options column, select Process Default, and then click OK.

9. When the Process succeeded message appears, click Close.

10. In Object Explorer, expand the AW_SSAS database, right-click Roles, and then click New Role.
11. In the Create Role window, in the Role name box, type TestRole, and then click OK.

12. Click OK in the error message stating that this user does not have permissions to create new objects.

13. In the Create Role window, click Cancel to close the window.

14. Close SQL Server Management Studio.

Results: At the end of this exercise, you will have:

Created a database role and added a database user within the role.

Exercise 3: Setting Up Security in SQL Server Reporting Services


 Task 1: Create a System Level Role
1. On the Windows Start menu, click the user icon, and then click Sign out.

2. Log in as ADVENTUREWORKS\Student with password Pa55w.rd.

3. On the taskbar, click Microsoft SQL Server Management Studio.

4. In the Connect to Server dialog box, in the Server type list, ensure that Reporting Services is
selected.

5. In the Server name list, select MIA-SQL\SSRS.

6. In the Authentication list, ensure that Windows Authentication is selected, and then click Connect.

7. In the navigation pane on the left side, expand the Security folder.

8. Right-click System Roles, and then click New System Role.

9. In the New System Role dialog box, type SecurityAdmin in the Name text box.

10. In the Description box, type Can set item and site security.

11. Select only the Manage report server security and Manage roles check boxes.

12. Click OK.

13. Close Microsoft SQL Server Management Studio.


14. On the taskbar, open Internet Explorer and in the address bar type mia-sql/reports_SQL2, and then
press Enter.

15. On the top menu, click the cog (settings), then click Site settings.

16. Click Security, and then click Add group or user.

17. In the Group or user name box, type AdventureWorks\GWebber.

18. Select only the SecurityAdmin role check box.

19. Click OK.

20. Close Internet Explorer.

 Task 2: Verify the System Level Role Permission


1. On the Windows Start menu, click the user icon, and then click Sign out.

2. Log in as ADVENTUREWORKS\GWebber with password Pa55w.rd.


3. Using Internet Explorer navigate to the site mia-sql/reports_SQL2. Note that a warning is displayed,
indicating that you do not have permission to view the folder. Click OK.
4. On the top menu, click the cog (settings), then click Site settings. Verify that only the branding and
security pages can be viewed on the left.

5. Close Internet Explorer.

6. On the Windows Start menu, click the user icon, and then click Sign out.

Results: At the end of this exercise, you will have:

Assigned the report security permission to Gregory Webber.

Tested the security permissions.



Module 4: Deploying BI Solutions


Lab: Deploying BI Solutions
Exercise 1: Creating a Stand-alone DACPAC
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab04\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

 Task 2: Create a DACPAC from SQL Server Management Studio


1. Start Microsoft SQL Server Management Studio.
2. In the Connect to Server dialog box, ensure that the Server type is Database Engine,
Authentication is Windows Authentication, and then in the Server name list, select MIA-SQL.

3. Click Connect.

4. In Object Explorer, expand the Databases node, expand EIM_Demo, expand Security, and expand
Users.

5. Right-click SalesReaders and click Delete.

6. In the Delete Object dialog box, click OK to remove the user.


7. In Object Explorer, right-click the EIM_Demo database, point to Tasks, and click Extract Data-tier
Application.

8. In the Extract Data-tier Application window, on the Introduction page, click Next.
9. In the Set Properties page, type EIM_Demo_Test for the application name, and type 1.1.0.0 under
version.
10. Add the description DACPAC of the EIM database, and then under Save to DAC package file,
browse to the D:\Labfiles\Lab04\Starter folder, click Save and then click Next.

11. On the Validation and Summary page, click Next.

12. On the Build Package page, when the build is complete, click Finish.

13. Leave SQL Server Management Studio open.

14. On the taskbar, click the File Explorer icon.

15. In File Explorer, browse to D:\Labfiles\Lab04\Starter.

16. Confirm that a file named EIM_Demo_Test.dacpac exists in the folder.

 Task 3: Deploying a Data Tier Application


1. In SQL Server Management Studio, in Object Explorer, click Connect, and then click Database
Engine.

2. In the Connect to Server dialog box, ensure that the Server type is Database Engine,
Authentication is Windows Authentication, and then in the Server name list, select MIA-
SQL\SQL2.

3. Click Connect.

4. In Object Explorer, under MIA-SQL\SQL2, right-click the Databases node, and click Deploy Data-
tier Application.

5. In the Deploy Data-tier Application window, on the Introduction page, click Next.

6. On the Select package page, browse to D:\Labfiles\Lab04\Starter, double-click EIM_Demo_Test.dacpac, and then click Next.

7. On the Update Configuration page, click Next.

8. On the Summary page, click Next.

9. On the Deploy DAC page, when the build is complete, click Finish.

10. In Object Explorer, under MIA-SQL\SQL2, right-click the Databases node and click Refresh.

11. Expand the databases node and verify that the EIM_Demo_Test database appears.
12. Close SQL Server Management Studio.

Results: At the end of this exercise, you will have:

Created a DACPAC using SQL Server Management Studio.

Deployed a Data Tier Application.

Created a DACPAC using Visual Studio.

Validated the creation of a Data Tier Application.

Exercise 2: Managing Builds in Team Foundation Server


 Task 1: Creating a Build Agent
1. Start Visual Studio 2017

2. In Visual Studio 2017, click the Team Explorer tab and then click the Home icon.

3. Under Project, click the Builds icon.

4. In the Builds window, click New Build Definition.

5. In Internet Explorer, in the page menu bar, click Builds.

6. On the Build Definitions page, click +Agent.

7. In the Get Agent window, on the Windows tab, click Download.

8. In the Internet Explorer notification bar, click Save, and then click Save As.

9. In the Save As dialog box, browse to D:\Labfiles\Lab04\Starter and click Save.

10. On the Windows desktop, click File Explorer and browse to D:\Labfiles\Lab04\Starter.

11. Right-click the agent zip file and click Extract All.

12. In the Extract Compressed (Zipped) Folders dialog box, extract all files to the
D:\Labfiles\Lab04\Starter\agent folder.

13. In File Explorer, move to the D:\Labfiles\Lab04\Starter\agent folder, right-click config.cmd, and then click Run as administrator.

14. In the User Account Control dialog box, click Yes.

15. At the Enter Server URL prompt, type the following text, and then press Enter:

http://mia-sql:8080/tfs

16. At the Enter authentication type prompt, press Enter to accept the default value of Integrated
authentication.

17. At the Enter agent pool prompt, press Enter to accept the default value.

18. At the Enter agent name prompt, accept the default value (MIA-SQL), and press Enter.

19. At the Enter work folder prompt, type D:\Labfiles\Lab04\Starter\agent\_work, and then press
Enter.

20. At the Enter run agent as service? prompt, type Y, and then press Enter.
21. At the Enter the user account to use for the service prompt, type AdventureWorks\ServiceAcct,
and then press Enter.

22. At the Enter Password for user account AdventureWorks\ServiceAcct prompt, type Pa55w.rd,
and then press Enter.

23. Verify that the configuration completes without any errors.

 Task 2: Creating a Build Definition


1. Return to Internet Explorer.
2. On the Build Definitions page, click + New Definition.

3. Under Select a template, click Empty process.

4. On the Tasks page, in the Name box, enter AW_BI Build definition.
5. In the Agent queue drop-down list box, select Default.

6. Click Get Sources.

7. Under From, click This project, and ensure that the Repository is set to $/Adventure Works ETL.

8. Click Add Task.

9. In the Add tasks pane, scroll down, click Command Line, and then click Add.

10. In the Process pane, click the Run task.

11. In the Tool box, type "C:\Program Files (x86)\Microsoft Visual Studio\2017\Enterprise\Common7\IDE\devenv.exe".

12. In the Arguments box, type AW_BI\AW_BI.sln /Build.


13. In the toolbar, click Save & queue, and then click Save & queue.

14. In the Queue build for AW_BI Build definition window, click Queue.
15. Verify that the status message Build #1 has been queued appears in the status bar near the top of
the page, and click the #1 link in this message.

16. Monitor the console output as the solution is built, and verify that the build completes without any
errors.
17. Close Internet Explorer.

18. Using File Explorer, move to the folder D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI. Verify that this folder contains a copy of the AW_BI solution and projects. The projects should all have been built successfully.

Results: At the end of this exercise, you will have:

Installed a build agent.

Created a build definition.


Validated a build definition.

Exercise 3: Exploring Deployment Methods


 Task 1: Manually Deploy a DACPAC
1. Start Microsoft SQL Server Management Studio.
2. In the Connect to Server dialog box, ensure that the Server type is Database Engine,
Authentication is Windows Authentication, and then in the Server name list, select MIA-SQL.

3. Click Connect.

4. In Object Explorer, under MIA-SQL, expand the Databases node, and right-click EIM_Demo, and
then click Delete.

5. In the Delete Object dialog box, select the Close existing connections check box, and then click
OK.

6. In Object Explorer, under MIA-SQL, right-click the Databases node, and click Deploy Data-tier
Application.

7. In the Deploy Data-tier Application dialog box, on the Introduction page, click Next.

8. On the Select Package page, browse to D:\Labfiles\Lab04\Starter\SetupFiles\AW_DW\bin\Debug, and then double-click AW_DW.dacpac.

9. On the Select Package page, click Next.

10. On the Update Configuration page, under Name type EIM_Demo, and then click Save Script.
11. In the Save Post-Deployment Script dialog box, click Documents, and then click Save.

12. On the Update Configuration page, click Next.

13. On the Summary page, click Next.

14. On the Deploy DAC page, when the build is complete, click Finish.

15. In Object Explorer, under MIA-SQL, right-click the Databases node and click Refresh. The
EIM_Demo database will appear.
16. In Object Explorer, expand the EIM_Demo database, and then expand Tables.

17. Right-click Landing.IncomingAgentsSourceA, and then click Select Top 1000 rows. Confirm that
results appear in the query window.

18. Right-click Landing.IncomingAgentsSourceB, and then click Select Top 1000 rows. Confirm that
results appear in the query window.

19. Close SQL Server Management Studio.
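
If you want an extra check that the deployed database contains data, the following query is a minimal sketch, not part of the lab steps; it assumes the Landing tables contain rows, as confirmed in steps 17 and 18:

USE EIM_Demo;
GO
-- Row counts should be greater than zero if the deployment loaded data into the landing tables.
SELECT COUNT(*) AS IncomingAgentsSourceA_Rows FROM Landing.IncomingAgentsSourceA;
SELECT COUNT(*) AS IncomingAgentsSourceB_Rows FROM Landing.IncomingAgentsSourceB;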

 Task 2: Deploying an Analysis Services Database using XMLA


1. Using File Explorer, move to the C:\Program Files (x86)\Microsoft SQL
Server\140\Tools\Binn\ManagementStudio folder and double-click the file
Microsoft.AnalysisServices.Deployment.exe to start the SQL Server Analysis Services Deployment
Wizard.
2. In the Analysis Services Deployment Wizard dialog box, on the Welcome to the Analysis Services
Deployment Wizard page, click Next.
3. On the Specify Source Analysis Services Database page, in the Database file box, type
D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSAS\AW_SSAS.database, and then click
Next.
4. On the Installation Target page, accept the default server (localhost) and database (AW_SSAS), and
then click Next.

5. On the Specify Options for Partitions and Roles page, accept the default options, and then click
Next.
6. On the Specify Configuration Properties page, click Next.

7. On the Select Processing Options page, click Next.

8. On the Confirm Deployment page, select the Create deployment script check box, and then click
Next (overwrite the existing deployment script if prompted).

9. On the Deploying database page, wait for the deployment script to be completed, and then click
Next.
10. On the Deployment Complete page, click Finish.

11. On the toolbar, click Microsoft SQL Server Management Studio.

12. To connect to the MIA-SQL Analysis Services instance, in the Connect to Server dialog box, ensure
that the Server type is Analysis Services, and Authentication is Windows Authentication. In the
Server name list, select MIA-SQL, and then click Connect.

13. On the File menu, point to Open, and then click File.
14. Browse to the folder D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSAS\ and double-
click AW_SSAS Script.xmla.

15. On the Query menu, click Execute.


16. When the execution completes, expand the MIA-SQL Analysis Services instance, then expand
Databases. Confirm the presence of the AW_SSAS database.

17. Close SQL Server Management Studio.

 Task 3: Grant Remote Access to Integration Services


1. On the Start page, type run, and then press Enter.

2. In the Run dialog box, type dcomcnfg, and then press Enter.

3. In the Component Services dialog box, in the left pane, expand Component Services, expand
Computers, expand My Computer, and then expand DCOM Config.

4. Right-click Microsoft SQL Server Integration Services 14.0, and then click Properties.

5. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, on the Security tab,
in the Launch and Activation Permissions section, click Edit.

6. In the Launch and Activation Permissions dialog box, click Add.

7. In the Select Users, Computers, Service Accounts, or Groups dialog box, in the Enter the object
names to select box, type Student, click Check Names, and then click OK.

8. In the Launch and Activation Permissions dialog box, ensure that Local Launch, Remote Launch,
Local Activation, and Remote Activation are all selected for Allow, and then click OK.

9. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, in the Access
Permissions section, click Edit.

10. In the Access Permissions dialog box, click Add.


11. In the Select Users, Computers, Service Accounts, or Groups dialog box, in the Enter the object
names to select box, type Student, click Check Names, and then click OK.
12. In the Access Permissions dialog box, ensure that Local Access and Remote Access are both
selected for Allow, and then click OK.

13. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, click OK.

14. Close the Component Services window.

15. On the Start page, type SQL Server 2017 Configuration Manager, and then click SQL Server 2017
Configuration Manager.

16. In the User Account Control dialog box, click Yes.


17. In SQL Server Configuration Manager, click SQL Server Services.

18. In the right pane, right-click SQL Server Integration Services 14.0, and then click Restart.

19. Wait for SQL Server Integration Services to restart, and then close SQL Server Configuration Manager.

 Task 4: Automating the Deployment of a SSIS Package


1. Right-click the Windows desktop, point to New, and then click Text Document.

2. Rename New Text Document.txt to CopySSISPackage.cmd, and then press Enter.

3. In the Rename dialog box, click Yes.

4. Right-click CopySSISPackage.cmd, and click Edit.

5. In Notepad, type the following:

dtutil /FILE D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSIS\EIM_Demo_DW_Load.dtsx /DestServer MIA-SQL /COPY SQL;EIM_Demo_DW_Load

6. On the File menu, click Save, and then close Notepad.

7. On the desktop, double-click CopySSISPackage.cmd.

8. A command prompt window will open; at the Are you sure you want to overwrite it? prompt, type
Y, and then press Enter. The command prompt will close when the command has completed.

9. On the toolbar, click Microsoft SQL Server Management Studio.

10. In the Connect to Server dialog box, ensure that the Server type is Integration Services,
Authentication is Windows Authentication, and then in the Server name list, select MIA-SQL,
then click Connect.

11. In Object Explorer, expand Stored Packages, and then expand MSDB.

12. Verify that the EIM_Demo_DW_Load package exists.



13. Close SQL Server Management Studio.
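
As an optional check for step 12, you can also confirm the deployment from a database engine query window connected to MIA-SQL. This is a minimal sketch, not part of the lab steps; packages deployed with dtutil to the SQL destination are stored in msdb:

-- The package copied by dtutil should appear in the msdb package store.
SELECT name, description, createdate
FROM msdb.dbo.sysssispackages
WHERE name = N'EIM_Demo_DW_Load';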

 Task 5: Perform a Full Deployment from Visual Studio


1. Return to Visual Studio 2017.

2. In Team Explorer, click Home.

3. Under Solutions, double-click AW_BI.sln.

4. In Solution Explorer, right-click Solution AW_BI (3 projects), and then click Deploy Solution.
5. If the Microsoft Visual Studio dialog box appears, click Yes.

6. In the Deployment Progress - AW_SSAS dialog box, when the deployment has completed, click
Close.

7. Close Visual Studio, saving any changes if prompted.

Results: At the end of this lab, you will have:

Manually deployed a DACPAC that has been part of a Team Foundation Server Build.

Used Visual Studio to manually deploy a Reporting Services project.

Executed an XMLA script from within SQL Server Management Studio.

Automated the deployment of an SSIS package.



Module 5: Logging and Monitoring in BI Operations


Lab: Monitoring BI Solutions
Exercise 1: Setting Up General Logging and Monitoring
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab05\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

 Task 2: Configuring Windows Event Logs


1. On the taskbar, click Start, type event, and then click Event Viewer.
2. Under Event Viewer (local), expand the Windows Logs node, right-click Application and click
Properties.

3. In the Log Properties – Application (Type: Administrative) dialog box, next to Maximum Log Size
(KB), type 50000.
4. Select the check box next to Archive the log when full, do not overwrite events and click Apply.

5. In the Event Viewer dialog box, read the message and then click OK.

6. Click OK to close the Log Properties – Application window.


7. Repeat the same steps for the System Log and the Security Log.

 Task 3: Configuring SQL Server Error Logs


1. Start Microsoft SQL Server Management Studio.
2. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.

3. In Object Explorer, expand the Management node, right-click SQL Server Logs, and then click
Configure.

4. In the Configure SQL Server error logs dialog box, on the General page, select the check box next
to Limit the number of error logs before they are recycled.

5. Next to Maximum number of error log files, type 14, and then click OK.

6. Close SQL Server Management Studio.

 Task 4: Configuring a Data Collector


1. Click Start, type Performance, and then click Performance Monitor.

2. To view the list of data collector sets, in the Performance Monitor window, on the left pane, click
Data Collector Sets.

3. To create a new data collector set, expand the Data Collector Sets node, right-click User Defined,
point to New, and then click Data Collector Set.

4. In the Create New Data Collector Set wizard, on the How would you like to create this new data
collector set? page, in the Name box, type SQL BI Monitoring.

5. Select the Create manually (Advanced) option and then click Next.

6. On the What type of data do you want to include? page, select the Performance counter check
box, and then click Next.

7. On the Which performance counters would you like to log? page, click Add.

8. In the dialog box, in the Available counters section, expand the Processor node, scroll down, click
%Processor Time, and then click Add.
9. Repeat step 8 to add the following counters:

Object | Counter | Instance
Processor | % privileged time | _Total
Processor | % user time | _Total
Memory | Pages/sec |
Memory | Available MBytes |
SQL Server: Buffer Manager | Buffer cache hit ratio |
SQL Server: Buffer Manager | Page life expectancy |
LogicalDisk | Average disk queue length | E: F: G:
Network Interface | Current bandwidth | <All instances>
SQL Server: SSIS pipeline 14.0 | Buffer memory |

10. When the counters are added, click OK.


11. On the Which performance counters would you like to log? page, click Next.

12. On the Where would you like the data to be saved? page, click Next.

13. On the Create the data collector set? page, ensure that Save and close is selected, and then click
Finish.

14. In Performance Monitor, right-click SQL BI Monitoring, and then click Start. Verify that the
collector starts running.
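
Several of the SQL Server counters in this collector set can also be read directly from the database engine. This is a minimal sketch, not part of the lab steps, and assumes a query window connected to MIA-SQL:

-- Current raw values of two buffer manager counters captured by the collector set.
-- (Buffer cache hit ratio is a raw value; divide it by its base counter to get a percentage.)
SELECT object_name, counter_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Page life expectancy', N'Buffer cache hit ratio');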

 Task 5: Creating an SSIS Custom Log


1. Start SQL Server Management Studio.

2. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.

3. In Object Explorer, expand the Integration Services Catalogs node, right-click SSISDB, and click
Customized Logging Level.

4. In the Customized Logging Level Management dialog box, click Create.



5. In the Create Customized Logging Level dialog box, under Name, type Errors and Warnings, then
click OK.

6. In the Customized Logging Level Management dialog box, with Errors and Warnings highlighted,
under Configuration, click the Statistics tab.

7. Select the check box next to Component Execution Statistics.

8. Click the Events tab, select the check box next to OnWarning, and select the check box next to
OnError.

9. Click Save and then click Close.

 Task 6: Creating a SQL Server Agent Job


1. In SQL Server Management Studio, expand SQL Server Agent.

2. Right-click the Jobs node and click New Job…

3. In the New Job dialog box, in the Name box, type EIM_Demo BI Load.
4. Under Select a page, click Steps, and then click New.

5. In the Step name box, type Execute EIM_Demo DW Load.

6. In the Type drop-down list, select SQL Server Integration Services Package.

7. On the Package tab, in the Package source drop-down list, select SQL Server.

8. In the Server drop-down list, type MIA-SQL.


9. In the Package: text box, click the ellipsis (…) icon, in the Select an SSIS Package dialog box, click
EIM_Demo_DW_Load, and then click OK.

10. In the New Job Step dialog box, click OK.

11. In the New Job dialog box, click OK.


12. In Object Explorer, under SQL Server Agent, expand Jobs, right-click EIM_Demo BI Load, and then
click Start Job at Step.

13. Verify that the job starts successfully (don't wait for it to complete; it will take a long time), and then
click Close.
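
The job can also be started from T-SQL instead of Object Explorer. This is a minimal sketch, not part of the lab steps, and assumes the EIM_Demo BI Load job was created as described above:

-- Start the SQL Server Agent job asynchronously; use Job Activity Monitor to check progress.
EXEC msdb.dbo.sp_start_job @job_name = N'EIM_Demo BI Load';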

Results: At the end of this lab, you will have configured:

Windows event logs.

SQL Server error logs.

A data collector.

An SSIS custom log.

A SQL Server Agent job.



Exercise 2: Setting Up Targeted Logging and Monitoring


 Task 1: Creating a SQL Server Profiler Template for the Database Engine
1. On the Windows desktop, click Start, type Profiler, and then click SQL Server Profiler 17.

2. On the menu bar, click File, point to Templates, and then click New Template.

3. In the Trace Template Properties dialog box, next to Select server type, click the drop-down, and
select Microsoft SQL Server “2017”.

4. In the New template name box, type DW Query Monitoring.


5. Click the Events Selection tab, and configure the following events, leaving the column options as
default:

Category | Event
Stored Procedures | SP: Starting
Stored Procedures | SP: Completed
Stored Procedures | SP: StmtStarting
Stored Procedures | SP: StmtCompleted
Stored Procedures | SP: CacheHit
TSQL | SQL: StmtStarting
TSQL | SQL: StmtCompleted
Transactions | TM: Begin Tran Starting
Transactions | TM: Commit Tran Starting
Transactions | TM: Rollback Tran Starting

6. Click Save.

 Task 2: Creating a SQL Server Profiler Template for Analysis Services


1. On the menu bar, click File, point to Templates, and then click New Template.

2. In the Trace Template Properties dialog box, next to Select server type, click the drop-down, and
select Microsoft SQL Server “2017” Analysis Services.

3. In the New template name box, type SSAS Query Monitoring.



4. Click the Events Selection tab, and configure the following events, leaving the column options as
default:

Category | Event
Errors and Warnings | Error
Queries Events | Query Begin
Query Processing | Execute MDX Script Begin
Query Processing | Query Cube Begin
Query Processing | Get Data From Aggregation
Query Processing | Query Subcube Verbose

5. Click Save.

Results: After completing this exercise, you will have created:

A SQL Server Profiler template.

An Analysis Services Profiler template.



Module 6: Troubleshooting BI Solutions


Lab: Troubleshooting BI Solutions
Exercise 1: Troubleshooting Data Warehouse Loads
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. On the taskbar, click the File Explorer shortcut.

3. View the contents of the D:\Labfiles\Lab06\Starter folder.

4. Right-click Setup.cmd, and then click Run as administrator.


5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

6. Start SQL Server Management Studio as administrator.

7. In the User Account Control dialog box, click Yes.

8. In the Connect to Server dialog box, set the Server type to Analysis Services, the Server name to
MIA-SQL, the Authentication to Windows Authentication, and then click Connect.

9. In Object Explorer, expand the Databases folder. If the AW_SSAS database exists, complete the
following steps to delete it:

a. Right-click the AW_SSAS database, and then click Delete.

b. In the Delete Objects dialog box, click OK.


10. Right-click Databases, and then click Restore.

11. In the Restore Database dialog box, in the Backup file box, type D:\Setupfiles\AW_SSAS.abf, and
then click OK.

12. In Object Explorer, right-click Databases, and then click Refresh.

13. Expand Databases, and verify that the AW_SSAS database appears.

14. Close SQL Server Management Studio.

 Task 2: Using the Appropriate Logging and Monitoring Tools


1. In the D:\Labfiles\Lab06\Starter folder, double-click the MIA-SQL_Unresponsive.jpg file and read
the email message.
2. Discuss with your partner the appropriate logging and monitoring tools to help in identifying the
problem.

3. List the tools you will use to identify the issue ready for a discussion at the end of the lab.

 Task 3: Collecting the Evidence


1. For the data warehouse, execute the tools that you have listed, collecting the information required.
2. Review the output of your data collection.

3. Determine the cause of the issue, backed with evidence from the monitoring, and determine how to
fix the issue.

 Task 4: Applying the Fix


1. Start Microsoft SQL Server Management Studio.

2. In the Connect to Server dialog box, in the Server type list, click Database Engine, in the Server
name list, ensure that MIA-SQL is selected, and then click Connect.

3. In Object Explorer, expand SQL Server Agent and then double-click Job Activity Monitor.
4. In the Job Activity Monitor - MIA-SQL dialog box, under Agent Job Activity, right-click
EIM_Demo BI Load, and then click Stop Job.

5. In the Job Activity Monitor - MIA-SQL dialog box, click Close.

6. Close Microsoft SQL Server Management Studio.

7. Start Microsoft SQL Server Management Studio as administrator.

8. In the User Account Control dialog box, click Yes.

9. In the Connect to Server dialog box, in the Server type list, click Integration Services, ensure the
Server name box is set to MIA-SQL, then click Connect.

10. In Object Explorer, expand Stored Packages, expand MSDB, right-click the EIM_Demo_DW_Load
package, and then click Export Package.

11. In the Export Package dialog box, click the ellipses next to Package path.

12. In the Save Package To File dialog box, navigate to D:\Labfiles\Lab06\Starter, then click Save.

13. In the Export Package dialog box, click OK.


14. Close Microsoft SQL Server Management Studio.

15. Start Visual Studio 2017.

16. On the File menu, point to New, and then click Project.
17. In the New Project dialog box, click Integration Services Project.

18. Clear the Create directory for solution and Add to Source Control check boxes.

19. In the Name box, type Exercise 1 Solution.

20. In the Location box, type D:\Labfiles\Lab06\Starter\, and then click OK.

21. In Solution Explorer, right-click the SSIS Packages folder, and then click Add Existing Package.

22. In the Add Copy of Existing Package dialog box, click the ellipses next to Package path.
23. In the Load Package dialog box, navigate to D:\Labfiles\Lab06\Starter, and then double-click
EIM_Demo_DW_load.dtsx.

24. In the Add Copy of Existing Package dialog box, click OK.

25. In Solution Explorer, right-click the EIM_Demo_DW_Load.dtsx package, and then click Open.

26. Scroll down until the Truncate Tables step is visible, right-click this step, and then click Edit.

27. In the Execute SQL Task Editor dialog box, in the SQL Statement field, to the right of the field, click
the ellipses.

28. In the Enter SQL Query dialog box, delete all the text below the line --Test code, and then click OK.

29. In the Execute SQL Task Editor dialog box, click OK.

30. On the File menu, click Save All and then close Visual Studio.

31. Start Microsoft SQL Server Management Studio as administrator.

32. In the User Account Control dialog box, click Yes.

33. In the Connect to Server dialog box, in the Server type list, click Integration Services, ensure the
Server name box is set to MIA-SQL, then click Connect.

34. In Object Explorer, expand Stored Packages, expand MSDB, right-click the EIM_Demo_DW_Load
package, and then click Import Package.

35. In the Import Package dialog box, in the Package path box, type
D:\Labfiles\Lab06\Starter\Exercise 1 Solution\EIM_Demo_DW_Load.dtsx.

Note: Make sure that you select the package in the Exercise 1 Solution folder, and not the
Starter folder.

36. In the Package name box, verify that the text is EIM_Demo_DW_Load, and then click OK.

37. In the Import Package dialog box, click Yes to overwrite the existing package.

38. In Object Explorer, click Connect, and then click Database Engine.
39. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.

40. On the File menu, point to Open, then click File.

41. In the Open File dialog box, in the File name box, type
D:\Labfiles\Lab06\Starter\BI_LoadReset.sql, and then click Open.

42. Click Execute.

43. In File Explorer, navigate to D:\Labfiles\Lab06\Starter, right-click StartAgentJob.cmd, and then


click Run as administrator.

44. In the User Account Control dialog box, click Yes.

45. In Microsoft SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand SQL Server
Agent, and then double-click Job Activity Monitor.

46. In the Job Activity Monitor - MIA-SQL dialog box, click Refresh until the EIM_Demo BI Load
status changes from Executing to Idle. This can take several minutes. Note that the job completes
with a data error, which you will resolve in the next exercise.

47. In the Job Activity Monitor - MIA-SQL dialog box, click Close.

48. Close Microsoft SQL Server Management Studio.

Results: After completing this exercise, you will have:

Used the appropriate logging and monitoring tools to identify the issue.

Resolved the unresponsive nature of the BI solution with a permanent fix.



Exercise 2: Troubleshooting SQL Server Analysis Services


 Task 1: Using the Appropriate Logging and Monitoring Tools
1. For Analysis Services, execute the tools that you have listed, collecting the information required.

2. Review the output of your data collection.

3. Determine the cause of the issue, backed with evidence from the monitoring, and determine how to
fix the issue.

 Task 2: Diagnose the Issue


1. Start Microsoft SQL Server Management Studio as administrator.

2. In the User Account Control dialog box, click Yes.

3. In the Connect to Server dialog box, in the Server type list, click Analysis Services.

4. In the Server name list, ensure that MIA-SQL is selected, and then click Connect.
5. In Object Explorer, expand Databases, expand AW_SSAS, expand Dimensions, right-click the
Agents dimension, and then click Process.

6. In the Process Dimension - Agents dialog box, click OK.

7. In the Process Progress dialog box, click Close.

8. In Object Explorer, right-click the Customers dimension, and then click Process.

9. In the Process Dimension - Customers dialog box, click OK.

10. In the Process Progress dialog box, click Close.


11. In Object Explorer, right-click the Date dimension, and then click Process.

12. In the Process Dimension - Date dialog box, click OK.

13. In the Process Progress dialog box, click Close.


14. In Object Explorer, right-click the Policy Event dimension, and then click Process.

15. In the Process Dimension - Policy Event dialog box, click OK.

16. In the Process Progress dialog box, click Close.

17. In Object Explorer under AW_SSAS, expand Cubes, right-click EIM Demo, and then click Process.

18. In the Process Cube - EIM demo dialog box, click OK.

19. In the Process Progress dialog box, note that the cube fails to process due to a data issue.

20. In the Process Progress dialog box, expand Command, and then expand each node.

21. Click the bottom-most error message, and then click View Details.

22. In the View Details dialog box, note that an attribute key -1 cannot be found for the AgentCode,
and then click Close.

23. In the Process Progress dialog box, click Close.

24. Close the Process Cube - EIM Demo dialog box.



 Task 3: Applying the Fix


1. In Object Explorer, click Connect, and then click Database Engine.

2. In the Connect to Server dialog box, in the Server name list, select MIA-SQL, and then click
Connect.

3. Expand Databases, expand EIM_Demo, expand Tables, right-click EDW.DimAgents, and then click
Edit Top 200 Rows.

4. In the MIA-SQL.EIM_Demo - EDW.DimAgents window, click the NULL row at the end of the list.

5. In the BrokerID column, type -1, in the Broker Name column, type Not Found, and then press
Enter.
6. In Object Explorer, in the AW_SSAS database, under Dimensions, right-click the Agents dimension,
and then click Process.

7. In the Process Dimension - Agents dialog box, click OK.

8. In the Process Progress dialog box, click Close.

9. Right-click the EIM Demo cube, and then click Process.

10. In the Process Cube - EIM Demo dialog box, click OK. Note that another error occurs.
11. In the Process Progress dialog box, expand Command, and then expand each node.

12. Click the bottom-most error message, and then click View Details.
13. In the View Details dialog box, note that an attribute key -1 cannot be found for the
CustomerCode, and then click Close.

14. In the Process Progress dialog box, click Close.

15. Close the Process Cube – EIM Demo dialog box.

16. In Object Explorer, right-click EDW.DimCustomers, and then click Edit Top 200 Rows.

17. Scroll to the NULL row at the end of the list.


18. In the CustomerID column, type -1, in the First Name column, type Not Found, and then press
Enter.

19. In Object Explorer, right-click the Customers dimension, and then click Process.

20. In the Process Dimension - Customers dialog box, click OK.


21. In the Process Progress dialog box, click Close.

22. Right-click the EIM Demo cube, and then click Process.

23. In the Process Cube - EIM Demo dialog box, click OK. Note that the cube is now processed
successfully.

24. In the Process Progress dialog box, click Close.

25. Close SQL Server Management Studio.

Results: After completing this exercise, you will have:

Used the appropriate logging and monitoring tools to identify the issue.

Resolved the Analysis Services cube processing errors with a permanent fix.



Module 7: Performance Tuning BI Queries


Lab: Performance Tuning a BI Solution
Exercise 1: Performance Tuning BI Queries
 Task 1: Prepare the Lab Environment
1. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.

2. In File Explorer, view the contents of the D:\Labfiles\Lab07\Starter folder.

3. Right-click Setup.cmd, and then click Run as administrator.

4. In the User Account Control dialog box, click Yes, and then wait for the script to finish.

 Task 2: Analyzing Queries with Execution Plans


1. Start Microsoft SQL Server Management Studio.

2. In the Connect to Server dialog box, ensure that the Server type is Database Engine, the Server
name is MIA-SQL, and the Authentication is Windows Authentication, and then, click Connect.
3. On the File menu, point to Open, and then click File.

4. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query1.sql.
5. On the Query menu, click Include Actual Execution Plan.

6. Click Execute.
7. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.

8. On the File menu, point to Open, and then click File.


9. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query2.sql.

10. On the Query menu, click Include Actual Execution Plan.

11. Click Execute.

12. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.

13. On the File menu, point to Open, and then click File.

14. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query3.sql.

15. On the Query menu, click Include Actual Execution Plan.

16. Click Execute.

17. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.
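
Alongside the graphical execution plans, you can gather I/O and timing evidence for each query. This is a minimal sketch, not part of the lab steps; run it in the same query window before executing Query1.sql, Query2.sql, or Query3.sql, and then review the Messages tab:

-- Report logical reads, physical reads, CPU time, and elapsed time for each statement run in this session.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;
GO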

 Task 3: Identify Performance Issue with BI Queries


1. Form a small group with two or three other students.

2. Discuss the findings you can identify from the execution plans.

3. In File Explorer, open Queries.docx in the D:\Labfiles\Lab07\Starter folder.

4. Based on the available information, identify the main issue with each query and determine whether to
refactor or use indexes to fix the query.

5. Close the Word document.

6. In SQL Server Management Studio, close Query1.sql, Query2.sql, and Query3.sql.

Results: At the end of this exercise, you will be able to:

Trace the actual execution plan of queries.

Use this information to identify performance issues with queries.

Exercise 2: Exploring SQL Server Query Store


 Task 1: View the Overall Resource Consumption Report
1. In SQL Server Management Studio, in Object Explorer, expand Databases, expand AdventureWorks,
and then expand Query Store.

2. Double-click the Overall Resource Consumption report.


3. In the report toolbar, click Configure.

4. In the Configure Overall Resource Consumption window, in the Time Interval section, click Last
hour, and then click OK.
5. Note that the report contains four reports that show:

a. Duration

b. Execution Count
c. CPU Time

d. Logical Reads

6. In the Duration report, position the cursor on the highest bar in the bar chart.

7. Note that a tooltip appears that provides the query execution information, including:

a. Interval Start

b. Interval End

c. CPU Time (ms)

d. Duration (ms)

e. Logical Writes
f. Logical Reads

g. Memory Consumption (KB)

h. Physical Reads

i. Execution Count

8. Close the Overall Resource Consumption report.



 Task 2: View the Top Resource Consuming Queries Report


1. Double-click the Top Resource Consuming Queries report.

2. In the Top Resource Consuming Queries window, notice that there is an execution plan in the
bottom part of the report showing the plan for the longest-running query.

3. At the top left, notice that a bar chart is displayed showing queries of the longest duration.

4. Find the query with the highest duration in the bar chart, and click it.
5. Note the query number from the plan summary on the top right for use in the next task.

6. At the top right of the Top Resource Consuming Queries window, observe the chart for Plan
Summary. Note that the query is not using a forced plan.

7. Under the chart for the plan summary, click the Force Plan button.

8. In the Confirmation dialog box, click Yes.

9. Close the Top Resource Consuming Queries window.
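
Forcing and unforcing a plan can also be done in T-SQL with the Query Store procedures. This is a minimal sketch, not part of the lab steps; the query_id and plan_id values shown here are placeholders, so substitute the numbers displayed in the Top Resource Consuming Queries report:

-- Force the chosen plan for the query (placeholder IDs; replace with values from the report).
EXEC sys.sp_query_store_force_plan @query_id = 1, @plan_id = 1;

-- Remove the forced plan later, if required.
EXEC sys.sp_query_store_unforce_plan @query_id = 1, @plan_id = 1;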

 Task 3: View the Tracked Queries Report


1. Double-click the Tracked Queries report.

2. At the top left, in the Tracked Queries box, type the query number from the previous task, and then
click the play button.
3. A chart appears showing the execution of the query. There is also the ability to force and unforce a
plan for the query.

4. Close the Tracked Queries report.

Results: At the end of this exercise, you will be able to:

View the Overall Resource Consumption report.

View the Top Resource Consuming queries.

Force an execution plan for a query.

Track a query with the Tracked Queries Report.

Exercise 3: Remediating Performance Issues


 Task 1: Rewrite Query1 for Better Performance
1. On the File menu, point to Open, and then click File.
2. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query1.sql.

3. Alter the existing code to match the following:

SELECT Name, ProductNumber, ListPrice
FROM Production.Product
WHERE ProductSubCategoryID = 1

4. On the Query menu, click Include Actual Execution Plan.

5. Click Execute.

6. Review the execution plan, making a note of improvements to the query execution plan.

 Task 2: Rewrite Query2 for Better Performance


1. On the File menu, point to Open, and then click File.

2. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query2.sql.

3. Alter the existing code to match the following:

DECLARE @Name nvarchar(50)
SET @Name = 'll Crankarm'
SELECT Name, ProductNumber, ListPrice, SafetyStockLevel
FROM Production.Product
WHERE SafetyStockLevel > 500 OR [Name] = @Name

4. On the Query menu, click Include Actual Execution Plan.

5. Click Execute.

6. Review the execution plan, making a note of the improvements made.

 Task 3: Using Indexes to Improve Query Performance


1. In SQL Server Management Studio, on the File menu, point to Open, and then click File.
2. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query3.sql.

3. Below the existing code, add the following:

ALTER TABLE [Sales].[SalesOrderDetail] ADD CONSTRAINT
[PK_SalesOrderDetail_SalesOrderID_SalesOrderDetailID] PRIMARY KEY CLUSTERED
(
    [SalesOrderID] ASC,
    [SalesOrderDetailID] ASC
) WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, SORT_IN_TEMPDB = OFF,
IGNORE_DUP_KEY = OFF, ONLINE = OFF, ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
GO

4. Select the ALTER TABLE statement that you added, and then click Execute.


5. On the Query menu, click Include Actual Execution Plan.

6. Highlight the SELECT statement at the top of the query window, and then click Execute.

7. Review the query execution plan, making a note of the improvements made.
8. Close SQL Server Management Studio.

Results: At the end of this exercise, you will have:

Rewritten a query for better performance.

Added indexes to improve query performance.
