10988C
Managing SQL Business Intelligence Operations
Information in this document, including URL and other Internet Web site references, is subject to change
without notice. Unless otherwise noted, the example companies, organizations, products, domain names,
e-mail addresses, logos, people, places, and events depicted herein are fictitious, and no association with
any real company, organization, product, domain name, e-mail address, logo, person, place or event is
intended or should be inferred. Complying with all applicable copyright laws is the responsibility of the
user. Without limiting the rights under copyright, no part of this document may be reproduced, stored in
or introduced into a retrieval system, or transmitted in any form or by any means (electronic, mechanical,
photocopying, recording, or otherwise), or for any purpose, without the express written permission of
Microsoft Corporation.
Microsoft may have patents, patent applications, trademarks, copyrights, or other intellectual property
rights covering subject matter in this document. Except as expressly provided in any written license
agreement from Microsoft, the furnishing of this document does not give you any license to these
patents, trademarks, copyrights, or other intellectual property.
The names of manufacturers, products, or URLs are provided for informational purposes only and
Microsoft makes no representations and warranties, either expressed, implied, or statutory, regarding
these manufacturers or the use of the products with any Microsoft technologies. The inclusion of a
manufacturer or product does not imply endorsement of Microsoft of the manufacturer or product. Links
may be provided to third party sites. Such sites are not under the control of Microsoft and Microsoft is not
responsible for the contents of any linked site or any link contained in a linked site, or any changes or
updates to such sites. Microsoft is not responsible for webcasting or any other form of transmission
received from any linked site. Microsoft is providing these links to you only as a convenience, and the
inclusion of any link does not imply endorsement of Microsoft of the site or the products contained
therein.
© 2018 Microsoft Corporation. All rights reserved.
Released: 02/2018
MICROSOFT LICENSE TERMS
MICROSOFT INSTRUCTOR-LED COURSEWARE
These license terms are an agreement between Microsoft Corporation (or based on where you live, one of its
affiliates) and you. Please read them. They apply to your use of the content accompanying this agreement which
includes the media on which you received it, if any. These license terms also apply to Trainer Content and any
updates and supplements for the Licensed Content unless other terms accompany those items. If so, those terms
apply.
BY ACCESSING, DOWNLOADING OR USING THE LICENSED CONTENT, YOU ACCEPT THESE TERMS.
IF YOU DO NOT ACCEPT THEM, DO NOT ACCESS, DOWNLOAD OR USE THE LICENSED CONTENT.
If you comply with these license terms, you have the rights below for each license you acquire.
1. DEFINITIONS.
a. “Authorized Learning Center” means a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, or such other entity as Microsoft may designate from time to time.
b. “Authorized Training Session” means the instructor-led training class using Microsoft Instructor-Led
Courseware conducted by a Trainer at or through an Authorized Learning Center.
c. “Classroom Device” means one (1) dedicated, secure computer that an Authorized Learning Center owns
or controls that is located at an Authorized Learning Center’s training facilities that meets or exceeds the
hardware level specified for the particular Microsoft Instructor-Led Courseware.
d. “End User” means an individual who is (i) duly enrolled in and attending an Authorized Training Session
or Private Training Session, (ii) an employee of a MPN Member, or (iii) a Microsoft full-time employee.
e. “Licensed Content” means the content accompanying this agreement which may include the Microsoft
Instructor-Led Courseware or Trainer Content.
f. “Microsoft Certified Trainer” or “MCT” means an individual who is (i) engaged to teach a training session
to End Users on behalf of an Authorized Learning Center or MPN Member, and (ii) currently certified as a
Microsoft Certified Trainer under the Microsoft Certification Program.
g. “Microsoft Instructor-Led Courseware” means the Microsoft-branded instructor-led training course that
educates IT professionals and developers on Microsoft technologies. A Microsoft Instructor-Led
Courseware title may be branded as MOC, Microsoft Dynamics or Microsoft Business Group courseware.
h. “Microsoft IT Academy Program Member” means an active member of the Microsoft IT Academy
Program.
i. “Microsoft Learning Competency Member” means an active member of the Microsoft Partner Network
program in good standing that currently holds the Learning Competency status.
j. “MOC” means the “Official Microsoft Learning Product” instructor-led courseware known as Microsoft
Official Course that educates IT professionals and developers on Microsoft technologies.
k. “MPN Member” means an active Microsoft Partner Network program member in good standing.
l. “Personal Device” means one (1) personal computer, device, workstation or other digital electronic device
that you personally own or control that meets or exceeds the hardware level specified for the particular
Microsoft Instructor-Led Courseware.
m. “Private Training Session” means the instructor-led training classes provided by MPN Members for
corporate customers to teach a predefined learning objective using Microsoft Instructor-Led Courseware.
These classes are not advertised or promoted to the general public and class attendance is restricted to
individuals employed by or contracted by the corporate customer.
n. “Trainer” means (i) an academically accredited educator engaged by a Microsoft IT Academy Program
Member to teach an Authorized Training Session, and/or (ii) a MCT.
o. “Trainer Content” means the trainer version of the Microsoft Instructor-Led Courseware and additional
supplemental content designated solely for Trainers’ use to teach a training session using the Microsoft
Instructor-Led Courseware. Trainer Content may include Microsoft PowerPoint presentations, trainer
preparation guide, train the trainer materials, Microsoft One Note packs, classroom setup guide and Pre-
release course feedback form. To clarify, Trainer Content does not include any software, virtual hard
disks or virtual machines.
2. USE RIGHTS. The Licensed Content is licensed not sold. The Licensed Content is licensed on a one copy
per user basis, such that you must acquire a license for each individual that accesses or uses the Licensed
Content.
2.1 Below are five separate sets of use rights. Only one set of rights apply to you.
2.2 Separation of Components. The Licensed Content is licensed as a single unit and you may not
separate their components and install them on different devices.
2.3 Redistribution of Licensed Content. Except as expressly provided in the use rights above, you may
not distribute any Licensed Content or any portion thereof (including any permitted modifications) to any
third parties without the express written permission of Microsoft.
2.4 Third Party Notices. The Licensed Content may include third party code that Microsoft, not the
third party, licenses to you under this agreement. Notices, if any, for the third party code are included
for your information only.
2.5 Additional Terms. Some Licensed Content may contain components with additional terms,
conditions, and licenses regarding its use. Any non-conflicting terms in those conditions and licenses also
apply to your use of that respective component and supplements the terms described in this agreement.
a. Pre-Release Licensed Content. This Licensed Content subject matter is on the Pre-release version of
the Microsoft technology. The technology may not work the way a final version of the technology will
and we may change the technology for the final version. We also may not release a final version.
Licensed Content based on the final version of the technology may not contain the same information as
the Licensed Content based on the Pre-release version. Microsoft is under no obligation to provide you
with any further content, including any Licensed Content based on the final version of the technology.
b. Feedback. If you agree to give feedback about the Licensed Content to Microsoft, either directly or
through its third party designee, you give to Microsoft without charge, the right to use, share and
commercialize your feedback in any way and for any purpose. You also give to third parties, without
charge, any patent rights needed for their products, technologies and services to use or interface with
any specific parts of a Microsoft technology, Microsoft product, or service that includes the feedback.
You will not give feedback that is subject to a license that requires Microsoft to license its technology,
technologies, or products to third parties because we include your feedback in them. These rights
survive this agreement.
c. Pre-release Term. If you are a Microsoft IT Academy Program Member, Microsoft Learning
Competency Member, MPN Member or Trainer, you will cease using all copies of the Licensed Content on
the Pre-release technology upon (i) the date which Microsoft informs you is the end date for using the
Licensed Content on the Pre-release technology, or (ii) sixty (60) days after the commercial release of the
technology that is the subject of the Licensed Content, whichever is earliest (“Pre-release term”).
Upon expiration or termination of the Pre-release term, you will irretrievably delete and destroy all copies
of the Licensed Content in your possession or under your control.
4. SCOPE OF LICENSE. The Licensed Content is licensed, not sold. This agreement only gives you some
rights to use the Licensed Content. Microsoft reserves all other rights. Unless applicable law gives you more
rights despite this limitation, you may use the Licensed Content only as expressly permitted in this
agreement. In doing so, you must comply with any technical limitations in the Licensed Content that only
allows you to use it in certain ways. Except as expressly permitted in this agreement, you may not:
• access or allow any individual to access the Licensed Content if they have not acquired a valid license
for the Licensed Content,
• alter, remove or obscure any copyright or other protective notices (including watermarks), branding
or identifications contained in the Licensed Content,
• modify or create a derivative work of any Licensed Content,
• publicly display, or make the Licensed Content available for others to access or use,
• copy, print, install, sell, publish, transmit, lend, adapt, reuse, link to or post, make available or
distribute the Licensed Content to any third party,
• work around any technical limitations in the Licensed Content, or
• reverse engineer, decompile, remove or otherwise thwart any protections or disassemble the
Licensed Content except and only to the extent that applicable law expressly permits, despite this
limitation.
5. RESERVATION OF RIGHTS AND OWNERSHIP. Microsoft reserves all rights not expressly granted to
you in this agreement. The Licensed Content is protected by copyright and other intellectual property laws
and treaties. Microsoft or its suppliers own the title, copyright, and other intellectual property rights in the
Licensed Content.
6. EXPORT RESTRICTIONS. The Licensed Content is subject to United States export laws and regulations.
You must comply with all domestic and international export laws and regulations that apply to the Licensed
Content. These laws include restrictions on destinations, end users and end use. For additional information,
see www.microsoft.com/exporting.
7. SUPPORT SERVICES. Because the Licensed Content is “as is”, we may not provide support services for it.
8. TERMINATION. Without prejudice to any other rights, Microsoft may terminate this agreement if you fail
to comply with the terms and conditions of this agreement. Upon termination of this agreement for any
reason, you will immediately stop all use of and delete and destroy all copies of the Licensed Content in
your possession or under your control.
9. LINKS TO THIRD PARTY SITES. You may link to third party sites through the use of the Licensed
Content. The third party sites are not under the control of Microsoft, and Microsoft is not responsible for
the contents of any third party sites, any links contained in third party sites, or any changes or updates to
third party sites. Microsoft is not responsible for webcasting or any other form of transmission received
from any third party sites. Microsoft is providing these links to third party sites to you only as a
convenience, and the inclusion of any link does not imply an endorsement by Microsoft of the third party
site.
10. ENTIRE AGREEMENT. This agreement, and any additional terms for the Trainer Content, updates and
supplements are the entire agreement for the Licensed Content, updates and supplements.
12. LEGAL EFFECT. This agreement describes certain legal rights. You may have other rights under the laws
of your country. You may also have rights with respect to the party from whom you acquired the Licensed
Content. This agreement does not change your rights under the laws of your country if the laws of your
country do not permit it to do so.
13. DISCLAIMER OF WARRANTY. THE LICENSED CONTENT IS LICENSED "AS-IS" AND "AS
AVAILABLE." YOU BEAR THE RISK OF USING IT. MICROSOFT AND ITS RESPECTIVE
AFFILIATES GIVES NO EXPRESS WARRANTIES, GUARANTEES, OR CONDITIONS. YOU MAY
HAVE ADDITIONAL CONSUMER RIGHTS UNDER YOUR LOCAL LAWS WHICH THIS AGREEMENT
CANNOT CHANGE. TO THE EXTENT PERMITTED UNDER YOUR LOCAL LAWS, MICROSOFT AND
ITS RESPECTIVE AFFILIATES EXCLUDES ANY IMPLIED WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NON-INFRINGEMENT.
14. LIMITATION ON AND EXCLUSION OF REMEDIES AND DAMAGES. YOU CAN RECOVER FROM
MICROSOFT, ITS RESPECTIVE AFFILIATES AND ITS SUPPLIERS ONLY DIRECT DAMAGES UP
TO US$5.00. YOU CANNOT RECOVER ANY OTHER DAMAGES, INCLUDING CONSEQUENTIAL,
LOST PROFITS, SPECIAL, INDIRECT OR INCIDENTAL DAMAGES.
It also applies even if Microsoft knew or should have known about the possibility of the damages. The
above limitation or exclusion may not apply to you because your country may not allow the exclusion or
limitation of incidental, consequential or other damages.
Please note: As this Licensed Content is distributed in Quebec, Canada, some of the clauses in this
agreement are provided below in French.
Remarque : Ce contenu sous licence étant distribué au Québec, Canada, certaines des clauses
dans ce contrat sont fournies ci-dessous en français.
EXONÉRATION DE GARANTIE. Le contenu sous licence visé par une licence est offert « tel quel ». Toute
utilisation de ce contenu sous licence est à vos seuls risques et périls. Microsoft n’accorde aucune autre garantie
expresse. Vous pouvez bénéficier de droits additionnels en vertu du droit local sur la protection des
consommateurs, que ce contrat ne peut modifier. Là où elles sont permises par le droit local, les garanties
implicites de qualité marchande, d’adéquation à un usage particulier et d’absence de contrefaçon sont exclues.
EFFET JURIDIQUE. Le présent contrat décrit certains droits juridiques. Vous pourriez avoir d’autres droits
prévus par les lois de votre pays. Le présent contrat ne modifie pas les droits que vous confèrent les lois de votre
pays si celles-ci ne le permettent pas.
Acknowledgements
Microsoft Learning would like to acknowledge and thank the following for their contribution towards
developing this title. Their effort at various stages in the development has ensured that you have a good
classroom experience.
Contents
Module 1: Introduction to Operational Management in BI Solutions
Module Overview 1-1
Lesson 1: Rationale for BI Operations 1-2
Course Description
Note: This course is based on SQL Server 2017, and supersedes the earlier B version of the
course, which was based on SQL Server 2016.
This three-day instructor-led course is aimed at database professionals who manage Business Intelligence
(BI) operations. This course looks at various options that enable business users to analyze data and share
their findings, starting with managed BI data sources and expanding to personal and external/public data
sources.
Audience
The primary audience for this course is business intelligence professionals.
The secondary audience for this course is technically proficient business users.
Student Prerequisites
This course requires that you meet the following prerequisites:
Basic knowledge of the Microsoft Windows operating system and its core functionality.
Working knowledge of database administration and maintenance.
Course Objectives
After completing this course, students will be able to:
Course Outline
The course outline is as follows:
Module 4, “Deploying BI Solutions” covers deploying BI solutions as part of the BI deployment lifecycle.
You are introduced to a number of tools and practices that can be used.
Module 5, “Logging and Monitoring in BI Operations” covers tools and practices to help the operations
team ensure the continued service of key applications that are used within the business.
Module 6, “Troubleshooting BI Solutions”. The task of trying to troubleshoot failed BI solutions can be
complex. It requires an understanding of the environments in which the BI solution is hosted, and an
understanding of the workloads that take place during the life cycle of the solution. Troubleshooting can
be made easier if the BI operations team has established defined standards for different tiers of servers for
the configuration, security, and deployment of the solution. Standards create a baseline environment for
the servers and the solution so that the BI operations team has a clear understanding of the environment
that it is troubleshooting.
Module 7, “Performance Tuning BI Queries” covers how the BI operations team works with the
development team to performance tune queries.
Course Materials
The following materials are included with your kit:
Course Handbook: a succinct classroom learning guide that provides the critical technical
information in a crisp, tightly-focused format, which is essential for an effective in-class learning
experience.
o Lessons: guide you through the learning objectives and provide the key points that are critical to
the success of the in-class learning experience.
o Labs: provide a real-world, hands-on platform for you to apply the knowledge and skills learned
in the module.
o Module Reviews and Takeaways: provide on-the-job reference material to boost knowledge
and skills retention.
o Lab Answer Keys: provide step-by-step lab solution guidance.
Modules: include companion content, such as questions and answers, detailed demo steps and
additional reading links, for each lesson. Additionally, they include Lab Review questions and answers
and Module Reviews and Takeaways sections, which contain the review questions and answers, best
practices, common issues and troubleshooting tips with answers, and real-world issues and scenarios
with answers.
Resources: include well-categorized additional resources that give you immediate access to the most
current premium content on TechNet, MSDN®, or Microsoft® Press®.
Course evaluation: at the end of the course, you will have the opportunity to complete an online
evaluation to provide feedback on the course, training facility, and instructor.
o To provide additional comments or feedback on the course, send an email to
[email protected]. To inquire about the Microsoft Certification Program, send an email to
[email protected].
Note: At the end of each lab, you must revert the virtual machines to a snapshot. You can
find the instructions for this procedure at the end of each lab.
The following table shows the role of each virtual machine that is used in this course:
Software Configuration
The following software is installed on the VMs:
Course Files
The files associated with the labs in this course are located in the D:\Labfiles folder on the
10988C-MIA-SQL virtual machine.
Classroom Setup
Each classroom computer will have the same virtual machine configured in the same way.
Hard Disk: Dual 120 GB hard disks, 7200 RPM SATA or better (Striped)
Additionally, the instructor’s computer must be connected to a projection display device that supports
SVGA 1024×768 pixels, 16-bit colors.
Module 1
Introduction to Operational Management in BI Solutions
Contents:
Module Overview 1-1
Lesson 1: Rationale for BI Operations 1-2
Module Overview
Operational management of BI solutions is on the increase. An organization’s need for information,
coupled with the timely delivery of this information, means that IT departments are placing as much
emphasis on operational frameworks to support BI solutions as they are on the development outcomes.
With the development of BI solutions complete, the right processes and people should be in place to
ensure that the solution delivers. You should also use supporting technologies to ensure smooth
operations. Furthermore, developing a supporting logging and troubleshooting framework can aid the
debugging and resolution of BI issues.
These elements are brought together within a single operational management framework that
enables a cohesive and proactive approach to managing the BI solution in the production environment. It
also ensures the continued operation of the solution, while providing a structured approach to solving BI
issues.
Objectives
After completing this module, you will be able to:
Describe the rationale for BI operations.
Lesson 1
Rationale for BI Operations
Do you find that your organization suffers from long running reports? Does your BI processing continue
to run into core business hours? Is your team responsive to BI failures? Do your business users and
management complain about the availability of information?
Regardless of the answers, these questions highlight the increasing need for IT departments to become
more proactive when dealing with the obstacles that prevent business users from accessing information.
Information is seen as a valuable asset to an organization. Therefore, you need to ensure the continued
availability of this information, and that the supporting platforms operate and deliver the information,
based on the business requirements.
Lesson Objectives
After completing this lesson, you will be able to:
Describe the importance of BI to a business.
Not having data available to facilitate decision-making is seen as an obstacle, and this is further
compounded as businesses move to more self-service BI delivery models. Therefore, IT departments not
only need to develop more robust BI solutions, but should also provide a support model that ensures that
BI data is readily available. There must also be processes in place that can respond to BI failures, or poorly
performing BI solutions.
Organizations now recognize that they need to manage BI operations and place more emphasis on
ensuring smooth deployments, timely processing of the BI solution, and the ability to respond to BI
failures. The way this can be implemented varies between businesses. Small organizations might take a
specific, one-off approach, whereas larger organizations will have a more structured approach that might
even be part of an operational framework. However, there are common elements in each approach that
ensure you meet any preset service level agreements.
Setting an appropriate schedule and business expectations for such events by using a signed-off
agreement is critical. Typically, such agreements form the basis of service level agreements in large
organizations.
When unplanned disruptive events occur, it is important to fix the issue first. However, after the issue is
resolved, a root cause analysis should be performed. Performing such an activity means you can define
the procedures that should be in place to respond to the issue in the future. In addition, you can tell the
organization’s management about the cause of the issue so that they can use the experience to better
deal with any similar issues that might happen in the future.
Operational Frameworks
Operational frameworks supply IT services with practical guidance on best practices, principles, and
activities for the planning, delivery, and operation of IT. Operational frameworks are also concerned
with how the planning, delivery, and operations are managed.
Because this course is concerned with the
operational management in the context of a BI
solution, a number of questions have to be
considered, including:
The answers to these questions will ultimately consist of a combination of people, technologies and
processes that provide a cohesive approach to managing BI operations. The objective is to deliver reliable
and stable BI environments for the business, and to resume normal service if the BI solution becomes
unavailable. An example of an operational framework is the Microsoft Operations Framework.
Comprehensive coverage of the framework can be found at Microsoft Operations Framework on Microsoft
TechNet:
In the impact analysis, the terms recovery point objective (RPO) and recovery time objective (RTO) come to
the surface. Many SQL Server® professionals now relate these terms to when a SQL Server database
should be restored and how long that might take. However, when it comes to defining RPO and RTO,
operational management takes a more holistic approach.
In operational management, the RPO refers to the amount of acceptable data loss in the event of a critical
system going down. RTO refers to the time it takes a business to resume normal operations from the
moment a disaster occurs, to the moment normal business operations are reconfirmed by the business
stakeholders.
BI solutions are increasingly becoming a critical part of business processes. Therefore, it is important to
work with the business to identify the critical business processes, and how the BI solution relates to those
business processes. From this, you can adapt the operational framework to define service level
agreements on the availability of these critical systems, in addition to adapting the supporting operational
framework to ensure that priority is given to these systems. Finally, you will provide the business with an
appropriate contribution to the Business Continuity Planning document.
Lesson 2
Roles in BI Operations
Managing BI operations consists of a combination of people, technologies, and processes that provide a
cohesive approach to BI solutions. In this lesson, you will explore the people who contribute to the
effectiveness of the operational support in a BI solution. In large organizations, one or more individuals
might fill these roles. In smaller organizations, one person might perform multiple roles.
Lesson Objectives
After completing this lesson, you will be able to describe how the following roles can contribute to an
effective operational management framework for a BI solution. The roles include:
Data directors/managers
BI architects
BI developers
Database administrators
Business users
Data Directors/Managers
More organizations are employing the skills of data
directors and/or managers to oversee the strategic
direction and support of the data that is generated
within the business. A BI director will work with a
company’s board of directors to understand the
overall business strategy and direction, and
articulate how the data within the business can add
value to meeting the needs of the business strategy.
In addition, a support model for the data has to be created. To ensure ongoing operations, BI
directors/managers will devise strategic plans for supporting the solutions that have been developed. This
will involve the management of the supporting technologies, defining the appropriate roles to support
the data, and defining processes that are understood by everyone in the organization. The BI director will
work with the board to set the relevant Service Level Agreement for a given BI solution.
In the context of a BI solution, common technologies that might be supported are SQL Server Database
Engine, Integration Services, Analysis Services and Reporting Services. More recently, technologies such as
Master Data Services and Data Quality Services have gained wider adoption within BI solutions.
BI Architects
A BI architect is a top-level analyst who will take
direction from the data director and create a BI
architecture that will meet the strategic needs of
the business. The BI architect will define a BI
architecture that provides a framework, using
technologies to gather and store data for reporting
and analytical purposes. BI architectures will vary,
depending on the direction that has been given by
the data director.
Nonfunctional Requirements. These requirements deal with how the solution will operate. Some of
the areas that are considered include:
o Availability
o Disaster recovery
o Backup
o Maintainability
o Configuration
o Responsiveness
Many of the nonfunctional requirements will be managed and monitored within an operational
framework.
BI architectures are typically very focused on solution architectures. In other words, the architecture is
created on the basis of providing a solution to meet a specific need. It is important that a BI architecture
be considered in the context of an enterprise solution architecture. This ensures that the solution takes
advantage of existing technologies that are used within the business, and more importantly, that the
solution is compliant. This might require the BI architect to seek input from another team that manages
the enterprise architecture.
BI Developers
BI developers will typically deliver most of the
functional requirements of a BI solution. Much of
the work involves:
Managing data quality with SQL Server Data Quality Services (SSDQS).
Database Administrators
Database administrators (DBAs) are usually the data
professionals who will support a BI solution after it
has been deployed to the production environment.
This work can involve them being on call to support
the mission critical systems within the business—24
hours a day, seven days a week. Should an
organization’s BI solution be deemed mission
critical, then it is highly likely that the responsibility
for the operational management of the solution will fall to the DBA team.
DBAs understand the existing server estate and how the solution can fit into that setup within the
organization. They can also advise the BI development team on the type of logging information that
would be useful to help them solve errors.
There are organizations that do not employ a dedicated DBA, but the activities for supporting a mission
critical BI solution will still remain. The same information regarding server configuration and support will
have to be managed. Therefore, it is important that these aspects are covered by the BI architect with the
support of the BI developers—and that support for the solution, after it has been produced, is established
early in the planning and design.
Some organizations might outsource the DBA service to third-party managed organizations—you should
liaise with the partner to determine what would come under the scope of support for the BI solution.
BI Consumers
BI consumers are regarded as the individuals in the
business who will use the data from the BI solution
for reporting and analytical purposes. From an
operational management perspective, they will
likely be one of the first channels to alert the BI
operations team to problems that are occurring
with the system.
By using Service Manager in Microsoft System Center, users can log help desk support tickets to report
incidents and change requests for a range of IT systems and solutions. IT departments can update
incidents and track the progress of the work being done. Having a system in place provides a
consolidated view of the issues that are affecting services within the IT infrastructure. Appropriate
responses can then be made to ensure that the problem is resolved; or an update is provided for business
users.
Question: Does your organization have similar roles to the ones outlined in this lesson?
Lesson 3
Technologies Used in BI Operations
A range of technologies can be used within BI solutions. Much of the focus in using these technologies is
to deliver the functional requirements of a BI solution through BI development. However, the
technologies used also provide functionality that can be valuable when managing the operations of a BI
solution.
Lesson Objectives
After completing this lesson, you will be able to:
On-Premises Technologies
Several technologies used within a BI solution
contain functionality that can help with BI
operations. These technologies include:
Operating System. This provides the platform on
which the BI solution will reside. It is often an
overlooked component when trying to solve errors,
or understand performance bottlenecks. Windows
Server® contains many tools that can give you
information regarding the operating system itself,
and the hardware. Examples of the supplied tools
that can help include:
o Event Viewer
o System Information
SQL Server Database Engine. The database engine provides the data repository that holds the data
warehouse and other supporting databases. When there are errors or performance bottlenecks, a number
of tools can be used to identify issues, including:
o Activity Monitor
o Extended Events
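For example, with Extended Events you might define a session that captures statements running longer
than an agreed threshold. The following is a minimal sketch only; the session name, threshold, and file
target are illustrative and are not taken from the course files.
Creating an Extended Events session to capture long-running statements (illustrative)
CREATE EVENT SESSION [LongRunningQueries] ON SERVER
ADD EVENT sqlserver.sql_statement_completed
(
    ACTION (sqlserver.sql_text, sqlserver.database_name)
    WHERE duration > 10000000  -- duration is reported in microseconds
)
ADD TARGET package0.event_file (SET filename = N'LongRunningQueries.xel');
GO
ALTER EVENT SESSION [LongRunningQueries] ON SERVER STATE = START;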
SQL Server Integration Services. SSIS provides the functionality to identify any ETL processes that have
failed, or that are long running. Some tools that can be used include:
o SSIS Logging
o Event Handlers
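In addition, when packages are deployed to the SSIS catalog (project deployment model), the catalog
views can be queried to find failed or long-running executions. The following query is a sketch that
assumes the default SSISDB catalog database; a status value of 4 indicates a failed execution.
Querying the SSIS catalog for recent failed executions (illustrative)
SELECT TOP (20)
    e.execution_id,
    e.folder_name,
    e.project_name,
    e.package_name,
    e.start_time,
    e.end_time
FROM SSISDB.catalog.executions AS e
WHERE e.status = 4  -- 4 = failed
ORDER BY e.start_time DESC;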
SQL Server Analysis Services. SSAS provides a wide range of tools with which you can monitor the
processing and query performance of data models, including:
o Query Logging
o Error Logging
o Flight Recorder
SQL Server Reporting Services. SSRS contains functionality, such as the report server execution log, that
you can use to identify errors or performance issues.
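For example, the execution log can be queried through the ExecutionLog3 view in the report server
database. The query below is a sketch that assumes the default ReportServer database name; the Time
columns are reported in milliseconds.
Querying the report server execution log (illustrative)
SELECT TOP (20)
    ItemPath,
    UserName,
    TimeStart,
    TimeDataRetrieval,
    TimeProcessing,
    TimeRendering,
    Status
FROM ReportServer.dbo.ExecutionLog3
ORDER BY TimeStart DESC;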
SQL Server Master Data Services. There is the option to enable Master Data Services logging through
tracing. This is enabled through the web.config file for Master Data Services, and requires modification to
the file before it can be used.
SQL Server Data Quality Services. You can enable logging settings in Data Quality Services to track any
operational issues in the Data Quality Services server, the Data Quality Client, and the Data Cleansing task
that is used within SSIS.
Cloud Technologies
Microsoft Azure™ provides an increasing range of
services that can support BI, reporting, and
analytical solutions. Cloud solutions are being
considered by more organizations, particularly
when their on-premises hardware reaches the end of its life. The following are some examples
of services that are available to provide a cloud-
based BI solution:
Azure Data Factory. This manages the movement and orchestration of data from both on-premises and
cloud data sources.
Azure Data Lake. This provides the ability to perform data analytics that can scale to terabytes of data
using the U-SQL query language that is an extension of the SQL language—with additional C# capability
to perform distributed analytics.
Power BI. This enables you to expose both cloud and on-premises data to create dashboards and rich
visualizations.
Other technologies that can be used within a cloud data architecture include:
o SQL Database
o HDInsight®
o Machine Learning
o Streaming Analytics
o Event Hubs
o Azure Search
Microsoft Azure services capture information regarding the compute and storage usage of each service—
known as telemetry. Organizations can expose some of the data in this telemetry to provide operational
insights that enable them to manage the systems.
Within Power BI, you can make use of content packs—these are prepackaged reports and dashboards,
provided by both Microsoft and third-party suppliers, that can enable you to quickly deploy reports and
dashboards on a range of areas. At the time of writing, Power BI provides the following content packs that
can help with operational management:
Visual Studio
Visual Studio provides an integrated environment
for developing a wide range of applications and
cloud services for Windows, iOS and Android
platforms. It is also an environment that can be
used for developing BI solutions, including:
SSIS packages
SSRS reports
You can also set a build to the Debug configuration—this denotes to Visual Studio that the code is to be
used for the purpose of debugging. Alternatively, you can set the code to Release configuration, which
denotes to Visual Studio that the code is ready for a final release. You can also set a build to both Debug
and Release configuration at the same time. Using these settings within Visual Studio can help manage
the releases of the code onto a local desktop or server.
The issue with using Visual Studio alone is that it does not effectively support BI development projects
where BI solutions are developed by multiple team members. However, this approach could be used for BI
solutions that are developed by a single BI professional.
Visual Studio requires the add-in Team Explorer to be installed to make use of a TFS. This enables
developers to connect to the instance of a TFS, and then create or select a Team Project Collection to host
multiple projects holding the files relating to the same code base. After connecting to a TFS, a local
workspace area should be defined on the BI developer’s desktop. This enables the project files for the TFS
server to be downloaded locally. Any changes that are made to the project files occur locally until the files
are checked into TFS.
A BI developer can add a solution to source control by selecting the Add to Source Control option in the
New Project dialog box. When clicking Save, the developer is asked which TFS server to save the project
to, and then selects the relevant project collection. After this has been
checked in, BI objects can be developed in the normal way—the difference being that the BI developer
must check in any saved work to the TFS server so that it is committed to the TFS database. BI developers
can see and browse the solution files that are stored in the TFS server by using Source Control Explorer.
Question: How many students in the room use Team Foundation Server in their BI solutions?
Lesson 4
Environment and Operational Standards
Managing BI operations is more than just responding to errors and performance bottlenecks. The
operational framework should try to take proactive steps to ensure that the team is dealing with
environments that are well understood, and procedures that are standardized. Taking steps to define
environments and procedures will ensure that the operations team knows how to respond to issues in
known environments in a timely manner.
Lesson Objectives
After completing this lesson, you will be able to:
Test environments. Used by a smaller subset of developers and a testing team to perform unit
testing of BI functionality and performance. In this environment, developers will have more restrictive
permissions to the servers—they might only be allowed to deploy the BI solution to the testing server.
User Acceptance Testing (UAT) environments. UAT environments will be used by a select pool of
trusted BI consumers and members of the testing team. This ensures that the entire BI solution meets
the functional and nonfunctional requirements that have been captured by the BI architects. The
intention is that testing is completed successfully for sign-off into the production environment.
The preceding terms represent some of the common terms that are used to describe nonproduction
environments. Some organizations may label these environments with different names, while other
organizations might use additional environments.
It is the responsibility of the BI operations team to ensure the continued operation of production
environments—to service business users’ information requests—and of the nonproduction environments,
to ensure that developers and testers can continue to develop new BI functionality.
When working with multiple servers and environments, the BI operations team would be more effective
by defining:
Configuration standards for the operating system and SQL Server components.
Monitoring standards for the operating system and SQL Server components.
This will formulate part of the operational framework under which the BI team will operate.
Standard operating procedures (SOPs) are typically defined for recurring activities such as:
Backups
Index maintenance
Cube processing
Defining a SOP is not necessarily only concerned with the technical detail of the work that is being
undertaken. You will also need to consider the business justification for the action, the time and frequency
at which the action will be executed, and the expected time it takes for the operation to complete. Having
a SOP for these recurring activities will inform the BI operations team of what is occurring on the servers at
any given time.
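For example, a SOP for the nightly data warehouse backup might document the exact command to run,
the schedule, and the expected duration. The command below only illustrates what such a SOP might
reference; the database name, path, and options are assumptions for this example.
An illustrative backup command documented in a backup SOP
BACKUP DATABASE [AdventureWorksDW]
TO DISK = N'D:\Backups\AdventureWorksDW.bak'
WITH COMPRESSION, CHECKSUM, INIT;  -- options would be set by the documented standard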
Occasionally, a SOP can also be applied to activities that may not necessarily take place on a recurring
basis. For example, a business might experience performance problems with a production server. This
might occur at random times, and the BI operations team may know how to resolve the issue without
understanding the root cause. Therefore, a SOP can be defined to perform an activity to resolve the issue,
but it is executed on a given event occurring on the server, rather than at a given time.
What benefits do SOPs provide? SOPs inform the BI operations team of the ongoing procedures on a
server. A SOP advises on the actions to take when a known issue arises and can give the BI operations
team the ability to perform these actions without the formal sign-off from a data director or manager—
because the procedure has been agreed in advance of the work being undertaken.
10. Optionally, define any SOPs that can be used in the future.
There are a number of objectives in taking this approach:
Adventure Works employees are increasingly frustrated by the time it takes for business reports to
become available on a daily basis. The existing managed BI infrastructure—including data warehouses,
enterprise data models, and reports and dashboards—is a valued source of decision-making information.
However, users are increasingly finding that it takes too long for the data to be processed in the overnight
load, resulting in reports not reaching business users until the early afternoon.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
Results: At the end of this exercise, you should have created a table that shows the roles required, with a
named employee who has key responsibilities.
2. Name the project AWMigration, in a solution named AWMig, and then save it in the
D:\Labfiles\Lab01\Starter folder, and store in Source Control.
4. Confirm that the solution AWMig.sln has been added to TFS using Source Control Explorer.
Results: At the end of the exercise, you will have configured Team Explorer to connect to a TFS server
named mia-sql. You will have created a project collection and stored an Integration Services project
within the project collection in the TFS server. You will have made a change to an object and checked the
object back in to TFS. Finally, you will view the changes in Source Control Explorer.
Question: Based on the interviews in the lab, discuss the findings of the group regarding the
role assignments and the responsibilities of each role. Are there any roles missing?
Question: Based on the interview document, how would you improve the BI developer’s
current working environment?
Module 2
Configuring BI Components
Contents:
Module Overview 2-1
Lesson 1: The Importance of Standardized Builds 2-2
Module Overview
The correct configuration of the BI components within the SQL Server® product stack will have a big
impact on the stability and performance of the overall BI solution. Configuring components by using best
practice will enable the BI operations team to rule out the data platform as a root cause of issues that
occur within the environment.
Defining standards for server builds can improve the team’s effectiveness in resolving issues in a known,
configured environment. Equally, understanding the type of architecture in which a BI solution is
implemented will drive the standards for that architecture.
Objectives
After completing this module, you will be able to:
Describe BI architectures.
Lesson 1
The Importance of Standardized Builds
Standardizing builds is the process of documenting and implementing a server or software configuration
for a given technology. It encourages specific configurations to be implemented using best practices, and
provides the following benefits:
It provides a known configuration for a server and/or software for a given environment.
It ensures a consistent configuration across multiple servers that an operational team can support.
It can improve the effectiveness of an operational team reacting to an issue within an environment.
Standardization is used in many aspects of a business. Applying standardization to a BI operational model
will enable a business to achieve efficiencies when supporting the solution operationally.
Lesson Objectives
At the end of this lesson, you will be able to:
Availability
Availability can be implemented at a hardware level and include hardware solutions such as redundant
array of independent disks (RAID) to protect against hard disk failure, and dual power supplies to provide
redundancy in power. Microsoft® Windows® and SQL Server also include features that provide
redundancy at the server and database level, including:
Windows Server Failover Clustering.
Always On Availability Groups.
Log Shipping.
Peer-to-Peer replication.
Deciding which SQL Server technology to implement depends on a number of factors, such as whether
the solution needs protecting at a server level, or at a database level. For server level protection, consider
Windows Server Failover Clustering. For database protection, Always On Availability Groups, log shipping,
and peer-to-peer replication provide different levels of protection. Agreeing on requirements for how
database failures should be handled will influence the final decision on which technology will meet the
business requirements.
Defining standards for how the hardware and software features are to be configured improves
consistency, and can help with capacity planning for a given feature. For example, you could define that
the Quorum drive of a Windows Server Failover Cluster is identified as Q:\ drive, stored on a SAN, with a 1
GB capacity, or that all Windows Server Failover Cluster configurations must use dual power supplies, and
RAID 5 arrays.
Disaster Recovery
Disaster recovery is the process of recovering systems that have failed in line with the agreed recovery time
objective (RTO) and recovery point objective (RPO). The technologies that can be used to facilitate disaster
recovery include:
Windows Server Failover Clustering.
Log Shipping.
Peer-to-Peer Replication.
Maintainability
Maintaining the data and structures in SQL Server databases, and maintaining the operations of additional
SQL Server technologies, should be performed on a regular basis. This will ensure that the services
continue to provide users with access to data that is both functional and performant. Standards should be
defined for the following operations:
Backup
In addition to the standard database maintenance, BI solutions require the data to be maintained
through:
SSIS execution
Cube processing
Report generation
Typically, all of the preceding activities are scheduled to run on a regular basis. Standards should be
defined and form the basis for standard operating procedures within the business.
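As an illustration, an index maintenance standard might state that indexes are reorganized or rebuilt when
fragmentation exceeds an agreed threshold. A query such as the following sketch, which uses the
sys.dm_db_index_physical_stats function against the current database, could underpin that standard; the
30 percent threshold is an assumption for this example.
Reporting index fragmentation for the current database (illustrative)
SELECT
    OBJECT_NAME(ips.object_id) AS table_name,
    i.name AS index_name,
    ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
    ON ips.object_id = i.object_id AND ips.index_id = i.index_id
WHERE ips.index_id > 0  -- exclude heaps
  AND ips.avg_fragmentation_in_percent > 30
ORDER BY ips.avg_fragmentation_in_percent DESC;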
Performance
Defining whether high performance or consistent performance is more important to the business will help
determine a consistent approach to configuring the BI supporting technologies.
Hardware Standards
The hardware on which the BI solution resides will
have an impact on the performance and the
operations of the solution. In an ideal world,
mission critical services would run on dedicated
hardware. However, many businesses have to
balance the needs for a service against the cost of
running dedicated servers. Only mission critical servers, where an outage would cause a monetary loss,
are afforded the luxury of a dedicated server.
Network
Hard disk
Memory
CPU
The minimum specifications required for the version of Windows and SQL Server that is being used should
provide a starting point for configuring the hardware. In addition, there are other practices that should be
followed to optimize the hardware usage of the physical server.
Network
For BI solutions, you can use fast network cards for production servers. In addition, static IP addresses
should be configured on the server on which SQL Server resides. For high availability solutions, multiple
network cards should be installed to provide redundancy. For greater redundancy of network failures,
configure the network cards with multiple DNS and default gateway addresses.
For SQL Servers that deal with high volumes of data transfer across a network, which is typical of data
warehouse loads, you should enable jumbo packets on a network card. This will increase the volume of
data that can be handled by each network packet from the default 1,581 bytes up to 9,000 bytes. This
increase will reduce the overheads of handling traffic at the network layer, but all devices should support
this feature, including switches.
6. Select Jumbo Frame and change the value from disabled to the desired value.
Hard Disks
Hard disks have an impact on the performance of a BI solution, including data warehouse load times,
analysis services processing times, and report performance.
Hard disk specification has the biggest impact on a data warehouse. As this is stored in the database
engine, the following best practices apply:
Configure NTFS cluster size to 64 KB to align with the 64 KB extents used by SQL Server.
From this best practice, three separate volumes would be needed to meet these requirements. If this is not
possible, define standards based on the resources available, and set expectations for service availability
and performance.
For high availability, consider a RAID configuration to provide redundancy. The following standard RAID
options are available:
RAID 0 – Stripe set. Data is divided between all available disks with no redundancy.
RAID 1 – Mirrored RAID array. The contents of one disk are duplicated on another.
RAID 5 – Striped set with parity. Data is divided across all but one disk, which contains a parity bit.
RAID 10 – Mirrored stripe set. Combines the performance benefits of RAID 0, with the redundancy
benefits of RAID 1.
Typically, databases are stored on a storage area network (SAN). SANs can be configured in different
ways; for example, they can be partly configured with traditional disk based storage, with the remainder
utilizing SSD technology. In fact, some SAN technologies can adaptively change the content that is stored
on an SSD or traditional storage—this is called adaptive optimization.
This might appear to provide benefits, but can cause problems. For example, a SAN that typically stores a
database on an SSD may drop the database to traditional storage because web developers are running
performance tests. Because of the high usage, the web server gets promoted to use the SSD at the
expense of the database. In this case, turning off adaptive optimization would be more desirable.
Memory
Make at least 2 GB of server memory available to the operating system, as SQL Server operates on top of
the operating system. Additional memory should be allocated for:
Consider the workloads of the applications running on the server. For example, most business users will
query approximately 20 percent of the data that is held in a data warehouse. As a result, if a data
warehouse is 50 GB in size, you would want to allocate 10 GB of memory as a starting point. For a true
picture of the workload, you can run Windows Performance Monitor, looking at the Process: Working
Set counter against the SQL Server process to identify how much memory is being used.
Alternatively, for an average view since the last SQL Server restart, you could run the following command:
Querying SQL Server memory usage with the sys.dm_os_process_memory DMV (dynamic
management view)
SELECT
(physical_memory_in_use_kb/1024) AS Memory_usedby_Sqlserver_MB,
(locked_page_allocations_kb/1024) AS Locked_pages_used_Sqlserver_MB
FROM sys.dm_os_process_memory;
This query returns a record of how much physical memory is being used, and how much memory is
locked. Locked memory is reserved for SQL Server and will not be yielded back to the operating system
when the operating system requests that memory.
CPU
In the first instance, ensure that CPU capacity remains available to the operating system. This can be one CPU on
systems that contain up to four CPU cores, or two CPUs for systems that go beyond four CPU cores. As
with memory, there should be a balance of the CPUs across the applications that are running on a server.
The remaining decisions on CPU should account for:
The minimum CPU required by non SQL Server applications; for example, Internet Information
Services (IIS).
You can establish the current CPU usage using Windows Performance Monitor—specifically, Processor: %
Privileged Time, which measures the percentage of CPU that Windows is using, and Processor: % User
Time, which measures the percentage of CPU that other applications, such as SQL Server, are using in real
time. You can also use a range of dynamic management views to look at CPU usage by query or by
database, in addition to other factors.
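For example, the sys.dm_exec_query_stats DMV can be combined with sys.dm_exec_sql_text to rank
cached statements by the CPU time they have consumed. The following query is a sketch of that
approach; it reports the top 10 statements by total worker time since their plans were cached.
Ranking cached statements by CPU usage (illustrative)
SELECT TOP (10)
    qs.total_worker_time / 1000 AS total_cpu_ms,  -- total_worker_time is in microseconds
    qs.execution_count,
    SUBSTRING(st.text, (qs.statement_start_offset / 2) + 1,
        ((CASE qs.statement_end_offset
            WHEN -1 THEN DATALENGTH(st.text)
            ELSE qs.statement_end_offset
          END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY qs.total_worker_time DESC;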
SQL Server does not distinguish between whether it is running on a physical server or a virtual server, so
there are the same considerations for networking, hard disks, memory, and CPU. One key point is that
multiple virtual servers can exist on the same physical host. It is important to ensure that the multiple
virtual machine hardware settings are not collectively higher than those that are available on a host. For
example, consider a physical server with 192 GB of RAM hosting four virtual servers, each allocated 64 GB
of RAM. In this case, memory pressure could exist on all virtual servers.
Software Standards
The key to managing operational environments is
consistency. Defining standards ensures that
consistency is applied across all servers in all
environments. The following key practices should
be followed when considering software standards
that are to be supported by a BI operational team.
Define the version of Windows that represents the corporate standard
The decision on which version of Windows to use may rest with an enterprise architect. It is important that
the BI operational team understand the corporate standards for using Windows and adhere to them. The
chosen version of Windows should also include the relevant Service Pack or Cumulative Update version.
All Windows servers should meet this corporate standard.
Define the version of SQL Server that represents the corporate standard
The BI architect should provide information to the BI operations team about which version of SQL Server
to use, including any Service Pack or Cumulative Update that should be installed on the SQL Server.
Define the edition of Windows and SQL Server for different environments
The edition of Windows and SQL Server to be used will depend upon the environment on which these
technologies are installed. For example, a nonproduction environment, such as a development
environment, might use Windows Server 2016 Standard Edition with SQL Server 2017 Developer Edition.
The main driver for making this decision is usually an attempt to reduce costs—the licensing cost for these
editions is lower than other editions, though the SQL Server feature set is the same as that in the SQL
Server 2017 Enterprise Edition.
Many organizations use antivirus software to provide protection for both the server and the desktops.
Antivirus software can slow down the performance of a computer that is running SQL Server. Antivirus
standards should be defined for SQL Server files and for how they are handled. If you have been given
approval by an organization’s security team, it is common for antivirus exceptions to be defined for the
following file types on a computer that is running SQL Server:
*.mdf
*.ndf
*.ldf
*.BAK
*.TRN
*.TRC
In addition, Windows Server Failover Clusters should not have antivirus software installed, because this is
known to cause issues with a cluster.
At an individual server level, you can run the following command on a Windows server to ascertain which
version of Windows is running:
winver
You can also run the following command in SQL Server Management Studio to establish which version of
SQL Server is running:
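One option is to combine the @@VERSION function with the SERVERPROPERTY function, as in the following illustrative query:
Returning SQL Server version information with @@VERSION and SERVERPROPERTY
--Illustrative example; other properties can be added as required
SELECT @@VERSION AS VersionInformation,
       SERVERPROPERTY('ProductVersion') AS ProductVersion,
       SERVERPROPERTY('ProductLevel') AS ProductLevel,
       SERVERPROPERTY('Edition') AS Edition;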
If you need to look at the version of an entire SQL Server and Windows estate, you can use the Microsoft
Assessment and Planning (MAP) toolkit. This can read the list of servers on a network, in addition to
providing information about the servers and installed Microsoft applications. For more information about
the MAP toolkit, see Microsoft Assessment and Planning (MAP) Toolkit on Microsoft TechNet:
The hardware and software installed on the cluster is certified for the version of Windows being
installed.
This policy ensures that the cluster is configured to a desired standard before product support is initiated.
It also means that the WSFC is at its most optimized for providing the high availability and disaster
recovery that it is designed to do. You should, therefore, consider the following best practices when
setting up any server—referred to as a node—which is part of a WSFC:
Use the same certified hardware on all servers.
For the best performance, ensure all nodes are on the same network links.
For SSIS, or if you run distributed transactions, install the Microsoft Distributed Transaction
Coordinator.
Run the cluster validation wizard, as it will check many of the best practices required for a cluster.
Always On Availability Groups provide protection and recovery at a database level. There is a single primary database and up to eight secondary databases. Data can be moved either synchronously or asynchronously to a secondary replica—which can also be read from—or have administrative routines, such as backup, performed on the secondary replica. A group of databases can be placed into an availability group and can fail over as a single unit automatically, manually, or by forcing a failover. An availability group should reside on a WSFC to provide server protection as well. When using Always On Availability Groups, use the following best practices:
For best performance, use a dedicated network card for availability group traffic.
Log Shipping
Log shipping provides a low-cost solution to high availability that can support multiple partners in this
availability solution. The hardware requirements are not as strict as those for a WSFC—the failover also
has to be handled manually. In situations where a degree of downtime is deemed acceptable, log
shipping is a useful high availability solution. Use the following best practices when configuring log
shipping:
Consider placing log shipping on the same network to mitigate against Internet failures.
Peer-to-Peer Replication
Peer-to-peer replication provides a high availability solution by maintaining copies of data across multiple
server instances. Built on transactional replication, peer-to-peer replication provides transactionally
consistent changes, in near real time, across multiple nodes. As data is maintained across the server in
near real time, peer-to-peer replication provides data redundancy, which increases the availability of data.
Like log shipping, the hardware requirements are relaxed, and the same considerations should be taken
into account.
Lesson 2
Configuration Considerations for BI Technologies
With the range of technologies available with which to implement a BI solution, there are many ways that
each technology can be configured. Each technology has certain configurations that should be
implemented to either improve performance, maintainability, disaster recovery, or availability, as defined
by the business requirements. Standards should be defined with production servers in mind, and then
adopted across other environments. If there are existing servers, changes should be applied and tested to
bring these servers to a level that meets the BI operations standards.
Lesson Objectives
At the end of this lesson, you will be able to define key configuration standards for the following:
Operating system
Database Engine
SQL Server Integration Services
Lock pages in memory is a security policy setting that enables a specified account to keep data in physical memory, preventing the operating system from paging that data out to virtual memory when a Windows server experiences memory pressure. You can use this in conjunction with the SQL Server instance option Max Server Memory to define how much memory is locked. This can improve the consistency of SQL Server memory usage and enhance performance. The SQL Server service account must be added to the Lock pages in memory policy by using the following steps:
1. On the Windows desktop, at the lower-left, click the Start icon and type gpedit.msc.
2. On the Local Group Policy Editor console, expand Computer Configuration, and then expand
Windows Settings.
6. In the Lock pages in memory Properties dialog box, click Add User or Group.
7. In the Select Users or Groups dialog box, add the SQL Server service account with privileges to run
sqlservr.exe, and then click OK.
8. Log out and then log back in for this change to take effect.
For certain SQL Server operations, disk performance can be improved through a process named instant file initialization. When a database operation, such as creating a new database, is performed in SQL Server, a process of zero initialization takes place. For example, if a new database is created that is 200 GB in size, Windows will place a zero value in every byte of space within the 200 GB. The process of adding zeros can be avoided if the SQL Server service account is added to the Perform volume maintenance tasks local security policy, using the following steps:
1. On the Windows desktop, at the lower-left, click the Start icon and type gpedit.msc.
2. On the Local Group Policy Editor console, expand Computer Configuration, and then expand
Windows Settings.
3. Expand Security Settings, and then expand Local Policies.
6. In the Perform volume maintenance tasks Properties dialog box, click Add User or Group.
7. In the Select Users or Groups dialog box, add the SQL Server service account with privileges to run
sqlservr.exe, and then click OK.
8. Log out and then log back in for this change to take effect.
You can adjust the performance options to configure Windows to prioritize background services over applications. You can also adjust the visual effects for best performance in the same area, to free up memory for the operating system. To modify these settings, you should perform the following steps:
8. On the Advanced tab, under Processor scheduling, click Background services, and then click OK.
Power Options manages how your server uses power. The default option is Balanced, but you can improve SQL Server performance by changing the power option to High Performance—the CPU cycles used to manage power are reduced. To set Windows Power Options, you should perform the following steps:
1. Click the Start button and type Control Panel, and then press Enter on your keyboard.
2. Click Hardware.
5. Select High performance and type a name for the power plan. Click Next.
6. Select how long the system's display should stay on before the computer goes to sleep. Click the
Create button to create the custom plan.
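The active power plan can also be set from a command prompt by using the powercfg utility. The following is a sketch that assumes the built-in High performance scheme with its default GUID; run powercfg /list first to confirm the GUIDs available on your server:
Setting the High performance power plan with powercfg
rem List the available power schemes and their GUIDs
powercfg /list
rem Activate the built-in High performance scheme (default GUID assumed; confirm on your server)
powercfg /setactive 8c5e7fda-e8bf-4a96-9a85-a6e23a8c635c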
Database Engine
To improve stability and performance, a range of configurations can be made to the database engine. These can be considered on a case-by-
case basis. The operations team should use a
baseline standard for SQL Server instances and
adjust that accordingly with the supporting
documentation. You should also check that
existing servers in your SQL Server estate also
meet these standards. The baseline standards
should include:
Separating database and log files
It is common to see database and log files on the same drive; you can move a file by running the following Transact-SQL and then moving the physical file:
--Modify the file metadata by using the logical file name and define a new location
ALTER DATABASE [AdventureWorks]
MODIFY FILE ( NAME = AdventureWorks_Data, FILENAME = 'D:\Data\AdventureWorks_Data.mdf' );
--Take the database offline, move the physical file to the new location in Windows Explorer,
--and then bring the database back online
ALTER DATABASE [AdventureWorks] SET OFFLINE;
--After the file has been moved:
ALTER DATABASE [AdventureWorks] SET ONLINE;
Like any database, the tempdb database and log files should be placed on separate disks. In addition, you
should place the tempdb on the fastest disk possible. Many Transact-SQL and SQL Server activities can
cause intense usage of the tempdb database, including:
In addition, contention of the tempdb can occur on the Page Free Space (PFS) page in the tempdb.
When temporary objects are created, one of the jobs of the PFS page is to identify pages that are free to
use. After an extent is allocated to a temporary object, a Global Allocation Map (GAM) page records what
extents have been allocated to data. Typically, there is one GAM page to track approximately 64,000
extents, or around 4 GB of data. There is also a Shared Global Allocation Map (SGAM) that denotes if an
extent is a uniform or a mixed extent—it also tracks approximately 64,000 extents, or around 4 GB of data.
To reduce the contention on the tempdb database, if your server contains eight or fewer CPU cores, you can create the same number of tempdb data files as you have cores. If you have more CPUs, you can use eight data files as a guide to the maximum number of equally sized data files you should create. This will create at least one PFS page per data file, spreading the allocation contention and reducing pressure on the tempdb database. These data files can reside on the same drive, because the intention is to create more PFS pages. Creating more than eight tempdb data files is likely to be a false economy; however, if you wish to use more data files, you can test the impact.
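As an illustration, the following Transact-SQL adds a further tempdb data file; the file name, path, size, and growth values are example assumptions and should be replaced with values that match your own standards:
Adding a tempdb data file
--Example values only; adjust the name, path, size, and growth to your standards
ALTER DATABASE tempdb
ADD FILE
(
    NAME = tempdev2,
    FILENAME = 'T:\Data\tempdb2.ndf',
    SIZE = 1024MB,
    FILEGROWTH = 256MB
);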
Memory settings can be used to determine the maximum amount of memory that SQL Server can use.
This can help balance memory management on servers that run multiple services, and also configure SQL
Server to leave memory available to the operating system. Used with the Lock pages in memory security policy, this can also fix the memory that SQL Server can use, including the minimum amount of memory.
To configure the memory setting within SQL Server, execute the following Transact-SQL:
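The following sp_configure example uses illustrative values of 8,192 MB for max server memory and 4,096 MB for min server memory—adjust these to suit the server and the other services running on it:
Configuring max server memory and min server memory with sp_configure
--Example values only; choose limits appropriate to your server
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'max server memory (MB)', 8192;
EXEC sp_configure 'min server memory (MB)', 4096;
RECONFIGURE;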
To configure the CPUs on a SQL Server instance, perform the following steps:
Typically, in the optimize phase, the full execution plan is stored in memory for future reuse. When optimize for ad hoc workloads is enabled, a smaller stub plan is stored in memory on the first execution—it becomes a full plan only on a subsequent execution. This ensures that single-use queries do not persist a full plan in the plan cache held in memory, thereby reducing the memory used.
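The option can be enabled with sp_configure; a minimal example follows. Optimize for ad hoc workloads is an advanced option, so show advanced options must be enabled first:
Enabling the optimize for ad hoc workloads option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'optimize for ad hoc workloads', 1;
RECONFIGURE;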
Taking backups is an important but routine task for the full protection and disaster recovery of data if the database becomes unavailable. However, before this is done, it is important to decide which recovery model to use. There are three options:
Full
Bulk_logged
Simple
If you require full protection, and the best option for full recovery in a disaster, you should use the full
recovery model. The full recovery model logs all transactions to the transaction log of the database. You
should select this model if data recovery is an important business requirement. When this model is selected and combined with a regular database and transaction log backup strategy, you can recover data to a specific point in time.
The bulk_logged recovery model logs individual transactions such as inserts, updates, and delete
statements to the transaction log. Transact-SQL statements, such as SELECT INTO, bcp, OPENROWSET,
WRITETEXT, UPDATETEXT, and BULK INSERT, are minimally logged.
The simple recovery model provides the most straightforward form of backup and restore because it does
not support transaction log backups. Therefore, data loss can occur, because you can only restore a
database from the most recent backup.
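As an example, the recovery model can be set with a single ALTER DATABASE statement; the database name shown is illustrative:
Setting the recovery model of a database
ALTER DATABASE AdventureWorksDW
SET RECOVERY FULL;
--Alternative options: SET RECOVERY BULK_LOGGED; or SET RECOVERY SIMPLE;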
SSIS processes data in memory structures known as buffers. Data flows through the pipeline in buffers as efficiently as possible. Developers can influence the space used in memory by adjusting a number of SSIS properties that control the amount of data that is placed in a buffer.
MaxBufferSize is a nonconfigurable property in an SSIS package that has a value of 100 MB. This means that data in a buffer cannot exceed 100 MB per dataset—otherwise SSIS will create another buffer of 100 MB and split the data. This can have an impact on memory usage. For example, if 150 MB of data is loaded in a dataset, SSIS will split the dataset and spread the data across multiple buffers. However, this may mean that only approximately 75 MB of data is stored in each buffer—this wastes 25 MB per buffer. There is also an Estimated Row Size property that contains metadata about the approximate size of the data. As a result, you can adjust the following properties to influence the buffer size:
If you are aware of a dataset row size, in addition to the number of rows, these values can be adjusted to
maximize the memory usage of an SSIS data flow. This can be difficult to calculate with some tables in a
data warehouse, but for tables such as dimension tables that are relatively static, this can improve the
performance of the data load.
Parallelism can occur in two ways within an SSIS package. An SSIS package can be designed in such a way that it executes multiple control flow tasks at the same time. You can also use the MaxConcurrentExecutables package property to control this parallelism. The MaxConcurrentExecutables property determines the number of threads that can be executed concurrently. The default setting of -1 provides a number of threads that is equal to the number of CPU cores on a server, plus two. For example, on a server with four processors, this would equate to six concurrent executables. This setting can be increased to benefit CPU utilization when SSIS runs on a dedicated server.
Advise SSIS developers not to use blocking transformations in SSIS data flows, but instead look for alternatives. Blocking transformations include:
Aggregate
Fuzzy Grouping
Fuzzy Lookup
Sort
These types of transformations create additional buffer space, introduce new threads when the
transformations are being executed, and can affect the performance of the SSIS server. For example,
instead of using a Sort transform to organize data, it is better to use an ORDER BY clause in a Transact-
SQL statement when retrieving data from a source system.
You can use package templates to include control flows, data flows, and associated properties that are
part of a standardized package for future reuse. This can help in maintaining agreed standards. To create
a package template, you create an SSIS package within Visual Studio. Add the common objects to the
package, then save and close Visual Studio. You should then copy the SSIS dtsx file created from the file
system to the following location:
3. In Solution Explorer, right-click the SQL Server Integration Services project, point to Add and click
New Item.
4. In the Add New Item dialog box, double-click the package template created.
You can define the location of the Analysis Services data in the instance properties of Analysis Services.
This provides the opportunity to control the storage location of the data that meets the needs of the
environment in which Analysis Services is installed. To change the data directory location, perform the
following steps:
2. In the Connect to Server dialog box, ensure the Server Type states Analysis Services, and specify a
server and credential to connect to the instance, and then click Connect.
4. In the Analysis Services Properties dialog box, under Select a page, click General.
5. In the fourth row, which states DataDir, go to the second column and click the ellipsis (…) button.
6. In the Browse for a Remote Folder dialog box, browse to a folder location and click OK.
In the Analysis Services instance properties, in the General page, a number of options can enable you to
tune how Analysis Services manages the memory on the server. Set the memory options to account for
the operating system and other applications—the memory settings define how SSAS will manage
connections and memory when thresholds are reached or exceeded. The settings are expressed as a percentage when a value of 0 to 100 is specified; a value above 100 changes the measurement to bytes. The settings include:
HardMemoryLimit. The default value is 0—this means that the HardMemoryLimit is set to a value midway between the TotalMemoryLimit and the total physical memory of the system. When this threshold is reached or exceeded, Analysis Services will terminate sessions, returning an error.
LowMemoryLimit. The default value is 65, which is 65 percent of physical memory, where SSAS
clears memory out of caches by closing expired sessions and unloading unused calculations.
VertiPaqPagingPolicy. This applies to tabular server mode only, and specifies whether Tabular SSAS
should page memory to disk when there is memory pressure. A value of 0 prevents paging to disk; 1
is the default and enables paging.
VertiPaqMemoryLimit. This applies to tabular server mode only. If paging to disk is allowed, this
property specifies the level of memory consumption (as a percentage of total memory) at which
paging starts. The default is 60.
OLAP\BufferMemoryLimit. The default value is 60, which denotes the amount of memory that can
be used for processing cubes.
Use partitioning in Enterprise Edition
You should recommend that developers use cube partitioning in the Enterprise Edition of SQL Server.
Partitions can be used as storage containers for data and aggregations of a cube. Every cube contains one
or more partitions. For a cube with multiple partitions, each partition can be stored separately in a
different physical location. Each partition can be based on a different data source. Partitions are not visible
to users; the cube appears to be a single object.
Memory settings can be specified to set a lower and upper limit on the amount of memory that is used by
Reporting Services, giving greater control over server resources. This includes:
WorkingSetMinimum. This specifies the minimum amount of memory in kilobytes that is to be used
by the reporting server.
These settings must be manually added to the rsreportserver.config file. In addition, thresholds can be set
that cause the report server to change how it prioritizes and processes requests—depending on whether it
is under low, medium, or heavy memory pressure. Configuration settings that control memory allocation
for the report server include:
MemorySafetyMargin. This is a percentage of the WorkingSetMaximum value that defines the low memory pressure band, which lies between the WorkingSetMinimum value and the MemorySafetyMargin value.
MemoryThreshold. This is a percentage of the WorkingSetMaximum value that defines the medium memory pressure band, which lies between the MemorySafetyMargin value and the MemoryThreshold value.
The high memory pressure band lies between the MemoryThreshold value and the WorkingSetMaximum value. Configuration involves setting the boundaries for each of the memory pressure bands in the RSReportServer.config file.
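As an illustration only, these elements appear in RSReportServer.config in the following form; the WorkingSetMinimum and WorkingSetMaximum values shown (in kilobytes) are example figures rather than recommendations, and are not present in the file by default:
Example memory elements in RSReportServer.config
<!-- Example values only; MemorySafetyMargin and MemoryThreshold are percentages,
     WorkingSet values are in kilobytes and must be added manually -->
<MemorySafetyMargin>80</MemorySafetyMargin>
<MemoryThreshold>90</MemoryThreshold>
<WorkingSetMinimum>2097152</WorkingSetMinimum>
<WorkingSetMaximum>4194304</WorkingSetMaximum>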
Use caching and snapshot reports
To optimize the user’s experience of browsing reports, the BI operations team can set the properties of a
report in the web portal to use caching or snapshot reports. Cached reports store a copy of the data
retrieved from a data source in the ReportServerTempDB database. Snapshots are stored in the
ReportServer database, and can also be used to retain a historical snapshot of the data.
The BI operations team should define the standards for which reports would make use of a cache—and
which reports would make use of snapshots. The factors that would influence the standards include:
Report subscriptions.
Reporting Services uses the SQL Server Agent to manage the schedule for the execution of the reports,
and the BI operations team will define a timetable for when report execution schedules should take place.
You could spread out the time the scheduled executions take place. It is common for many report
executions to occur overnight at the same times during the schedule, causing the server to be under
unnecessary pressure. Organizing the schedule to spread the load of executions will benefit the server.
DQS_MAIN
DQS_PROJECTS
DQS_STAGING_DATA
All these databases should be implemented following database best practice, such as separating data and
log files onto separate disks and performing regular backups.
Data Quality Services can integrate well with Master Data Services but, for this to occur, both services must be running on the same instance of SQL Server—and on the same server. To configure integration, you should perform the following steps:
1. Open Master Data Services Configuration Manager.
3. On the Web Integration page, click Enable Integration with Data Quality Services.
Question: Which area of the BI technology stack will you explore in more depth when you
get back to your organization?
Lesson 3
BI Architectures
SQL Server BI solutions are supported on a wide range of platform architectures that are designed to
meet the business requirements. The BI operations team is highly likely to be supporting a wide range of
platform architectures across the given environments. By using the configuration standards outlined in the
previous topic, different architectures can be implemented to meet the availability, disaster recovery,
maintenance, and performance needs of a solution.
Lesson Objectives
At the end of this lesson, you will be able to define common architecture and the appropriate
configuration standards for:
Stand-alone architectures
Stand-alone Architectures
Stand-alone architectures refer to SQL Server
installations that are performed on a single
server—and there are multiple services running on
the same physical or virtual servers. Scenarios in
which stand-alone architectures are found include:
In stand-alone environments, it is likely that compromise will be required in the configuration of the various servers. However, certain configuration standards should not be compromised, because compromising them could prevent access to a service, negatively affect the performance of the server, or compromise the recoverability of the service. These standards can include:
Implementing operating system standards. The operating system setting can aid the performance
of the server. One setting that may be compromised is Lock pages in memory—this should be
considered on a case-by-case basis.
Balancing memory and CPU utilization across services. It is important to ensure that all services
running on the single server have enough memory and CPU allocated to them to operate efficiently.
Defining database recovery models. Database recovery models affect the level of backup and
restore capabilities. For any database that needs to be restored to a point in time, you should ensure
that the recovery model is set to full.
Adding an Analysis Services administrator account. An Analysis Services instance that does not
have an administrator defined will be inaccessible—in this case, access must be granted by the built-
in administrators.
The area of compromise is the management of the databases and database files. Typically, stand-alone
architectures may not be endowed with many hard disks. As a result, databases and their associated files
might have to be collocated on the same volumes. In this scenario, the following databases and database
files should be managed in the following priority:
The third area covers any additional settings that are agreed, such as antivirus software configuration and
third-party application configuration requirements.
In the scenario on the slide, a Reporting Services instance is set up in native mode in a high availability
architecture. The following standards should be adopted:
On all servers:
1. Servers involved in the WSFC must pass the cluster validation test.
1. The Reporting Services host name and IP address of the NLB should be registered in DNS.
2. The IP addresses of the SSRS1 and SSRS2 servers should be enlisted in the NLB to accept round robin requests for access to the report server.
For more information about high availability for Reporting Services, see:
http://aka.ms/Psec4f
As a result, additional standards should be applied to the Azure virtual machine, including:
In addition, Azure Infrastructure as a Service (IaaS) components will have to be configured to allow
network communication from the on-premises servers to the server in the cloud, including:
1. A dedicated connection such as ExpressRoute, or a VPN tunnel for traffic over the Internet.
2. The creation of a virtual network that will host the virtual machine.
3. Enabling a firewall setting on the Azure virtual machine to allow remote Reporting Services requests.
If a failover occurs, you should complete the following steps before resuming operations:
1. Stop the instance of the SQL Agent service that was being used by the primary database engine
hosting the Reporting Services databases.
2. Start SQL Agent service on the computer that is the new primary replica.
4. If the report server is in native mode, stop the Report Server Windows service by using Reporting Services Configuration Manager.
5. If the report server is configured for SharePoint mode, stop the Reporting Services shared service in
SharePoint Central Administration.
For more information about Reporting Services and availability groups, see:
http://aka.ms/Eja06d
Question: Whilst isolated services would be a desired architecture for many BI production
scenarios, how do you manage balancing services on a stand-alone architecture setup? Do
you see any options that have just been presented that could help your current situation?
Lesson 4
SharePoint BI Environments
Some organizations choose to surface the presentation layer of their BI solution within a SharePoint
solution. SharePoint enables the sharing and collaboration of BI assets across the business in a controlled
and secure manner. It can integrate with SQL Server technologies such as Reporting Services, and other
presentation technologies, including PowerPivot and PerformancePoint. When supporting a SharePoint BI
solution, standards are equally important to enable a BI operations team to support the solution in an
effective manner.
Lesson Objectives
At the end of this lesson, you will understand the SharePoint considerations for:
Hardware
Hardware Considerations
SharePoint Server has a multitier architecture that enables the sharing of, and collaboration on, business documents. There are three layers to the SharePoint Server architecture:
Application layer
This layer includes applications such as Reporting
Services, PowerPivot, and PerformancePoint.
Database layer
The BI operations team might be required to support the solution at both the database and application
layers. Meeting the minimum standards and accounting for capacity based on the databases used will
ensure that the SharePoint server is not put under resource contention. From a database layer perspective,
standard operating system and database practice should be employed. This will include databases related
to both SharePoint and application databases, including:
SharePoint_Config
SharePoint_AdminContent_<GUID>
WSS_Content
AppManagement
Bdc_Service_DB_<GUID>
Search_Service_Application_DB_<GUID>
Search_Service_Application_AnalyticsReportingStoreDB_<GUID>
Search_Service_Application_CrawlStoreDB_<GUID>
Secure_Store_Service_DB_<GUID>
SharePoint_Logging
DefaultPowerPivotServiceApplicationDB_<GUID>
ReportingService_<GUID>
ReportingService_<GUID>_TempDB
80 GB minimum.
Excel Services
Excel Services enables the sharing of Excel® files and is a prerequisite service for Power Pivot for
SharePoint. Excel Services is configured in the application layer.
Power Pivot for SharePoint enables Power Pivot functionality with a Power Pivot instance of Analysis
Services, created in SharePoint Server by using SQL Server setup.
PerformancePoint Services
PerformancePoint Services is installed as part of SharePoint setup on the application servers, on the
SharePoint farm. SharePoint Central Administration can be used by the BI operations team to complete
the configuration of PerformancePoint and other BI services hosted within SharePoint Server.
This service is required when, for example, Excel Services has to communicate with a remote data source
that is hosted outside of the SharePoint farm. For a BI implementation of a SharePoint farm, the Claims to
Windows Token Service must be configured on the same server on which Excel Services is installed.
Adventure Works employees are increasingly frustrated by the time it takes for the business reports to
become available on a daily basis. The existing managed BI infrastructure, including a data warehouse, enterprise data models, and reports and dashboards, is a valued source of decision-making information. However, users are increasingly finding that it takes too long for the data to be processed in the overnight load, resulting in reports not reaching business users until the afternoon.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
3. Close WordPad.
Results: At the end of this exercise, you should have created a table that shows which areas of the data
platform should be standardized, including:
2. Set the Lock pages in memory for the SQL Server Database Engine service account.
2. Write a query to set the instance level property to optimize for ad hoc workloads.
3. Execute the query.
2. Move the tempdb data files to the G:\Microsoft SQL Server\MSSQLSERVER\Data folder.
Question: Are there any additional changes you would have made to the server that was
configured?
Incorrectly configured memory and CPU settings for multiple services on a single server.
Data and log files stored on the same drive.
Local security policy settings of an operating system not being applied to servers.
Module 3
Managing Business Intelligence Security
Contents:
Module Overview 3-1
Lesson 1: Security Approach to BI Solutions 3-2
Module Overview
Managing the security of data within an organization should be the first priority for any operational team.
Not only could the ramifications of a data leak lead to commercial losses, but there may also be legal and
financial penalties that could have wider implications for the business.
It is very important that the business intelligence (BI) operations team takes a holistic approach to
securing the data. Considerations should include the physical security of the data, in addition to
protection at an operating system or SQL Server® level. Transfer of data to other sites and data at rest
may have to be protected. In these cases, encryption components come into play. Meeting compliance
requirements may force a business to track activity on SQL Servers, or provide access to data by using
auditing.
Objectives
At the end of this module, you will be able to:
Lesson 1
Security Approach to BI Solutions
The approach taken by the BI operational team should fall in line with any security policies that have been
defined within the organization. This approach should apply throughout the entire technology stack, and
consideration should also be extended to include nontechnology security issues. This should lead to a culture of defense in depth, with the aim of mitigating any potential weaknesses that could allow unauthorized access to the data.
Lesson Objectives
After completing this lesson, you will be able to:
Physical security
Physical access to the servers that host the data should be controlled. Many organizations that hold
sensitive data will host servers within a locked room where access is controlled. Other organizations will
store data within remote data centers that are hosted by a managed service provider, such as Microsoft®
Azure™. These data centers operate a very strict policy to control who can gain access to their locations.
For auditing purposes, key cards are typically used to log the individuals who have accessed a server room
or data center.
People
There are examples of breaches in security that have occurred because of social engineering—where
someone elicits information from another individual to gain unauthorized access, either to data or data
centers. It is important that the organization breeds a culture of defense in depth, where it is deemed
appropriate for an employee to question the actions of an individual when security is involved. Employees
should feel comfortable with this situation.
The BI operations team should evaluate security requests on a case-by-case basis. There is a difference
between what a user wants, and what a user needs. Users may request elevated privileges out of
convenience, or to bypass a process. If a request is thought to be inappropriate, the BI operations team
should reject it and give reasons for this course of action.
Update maintenance
Windows and SQL Server updates should be maintained on all servers. Service Packs and Cumulative
Updates will contain security updates that are designed to reduce security vulnerabilities. It is important to
provide a nonproduction environment that enables the testing and impact analysis of installing an update
before it is installed within a production environment. It is equally important to inform the business when
SQL Server falls out of scope for Microsoft support—a couple of years’ notice is normally given but, after
this time, there is no guarantee that further updates will be available to help protect against future
vulnerabilities.
Surface area reduction involves installing only the technology that is required for a service to be provided
to a business. This may form part of a standardized SQL Server build, a subject covered in Module 2 of this
course. Removing or disabling unused services and applications from a server reduces the opportunity for
an attacker to interrogate SQL Server or Windows components.
When the authentication model for SQL Server is set to Windows authentication or mixed mode
authentication, you can use domain objects to control access to a SQL Server. Active Directory® is a
directory services database that holds information about objects including computers, users, and groups.
Objects can be managed and secured through a central management console. The network team will
already be managing security access to users through group management. The range of groups available
includes:
Domain local groups. These are used to define access to resources such as a SQL Server login, and
typically contain users and other groups as members.
Global groups. These are used to group together a collection of users and other global groups from
the same domain, to represent people who work in the same function or department. For example, a
global group named Accounts may contain users or other global groups named Accounts Payable and Accounts Receivable. All users who are members of those groups are also members of the Accounts global group.
Universal groups. These are used to group together a collection of users and global groups from
other domains to represent people who work in the same function or department. Universal groups
operate in a similar way to global groups, but their scope extends beyond the domain in which they
belong.
Active Directory administrators will typically assign users to a global group. This group is then placed in a
domain local group to which permissions are assigned. This approach is referred to as the A, G, DL, P
approach and may be used within the business. The BI operations team should be aware of the available
groups and use them as much as possible to ensure a compliant and coherent security approach to
managing access to a SQL Server.
Encryption
Encryption is the process of converting data into a ciphertext format so that it is unreadable to unauthorized personnel. Data encryption can be performed by SQL Server, by the operating system, or by applications, and can be used to provide an additional level of protection in the following areas:
Column level protection to hide sensitive data, such as credit card information, from internal users.
The BI operations team should consider all of the preceding areas when they are looking to provide a
secure, defense-in-depth approach to the organization’s BI assets. This will involve working with other
teams, such as Active Directory and network administrators, database administrators, and an
organization’s security team, to devise a comprehensive security approach that meets the needs of the
business.
Database
Schema
Object
From a database engine perspective, the common activities involve managing SQL Server logins and
database users. In addition, there is the ability to manage encryption technologies to associate with logins
or database users, or to encrypt the data that resides within SQL Server. The Always Encrypted feature
ensures that data remains encrypted when a high privileged account, such as a database administrator, is
administering the data—so they should not be able to see the data. Furthermore, you can use Row Level
Security to control access to rows in a database table, based on the characteristics of the user executing a
query.
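The following is a minimal sketch of Row Level Security, assuming a hypothetical dbo.FactSales table with a SalesRegion column and database users named after the regions; it is intended only to show the shape of the feature, not a production design:
A simple Row Level Security filter predicate and security policy
--Assumes a hypothetical dbo.FactSales table with a SalesRegion column,
--and database users whose names match the region values
CREATE FUNCTION dbo.fn_RegionPredicate (@SalesRegion AS nvarchar(50))
RETURNS TABLE
WITH SCHEMABINDING
AS
RETURN
    SELECT 1 AS fn_result
    WHERE @SalesRegion = USER_NAME() OR IS_MEMBER('db_owner') = 1;
GO
CREATE SECURITY POLICY SalesRegionFilter
ADD FILTER PREDICATE dbo.fn_RegionPredicate(SalesRegion)
ON dbo.FactSales
WITH (STATE = ON);
GO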
Reporting Services
Access to SQL Server Reporting Services (SSRS) is defined at both the system level and the content level—
access should be controlled by either Windows users or groups, or SQL Server logins. You should also
consider the application protocol that is used to access the report server. For secure data, there is the
option to use HTTPS, or the business might want to use the default HTTP protocol. Finally, settings can be
configured in the RSReportServer.config file and used to mitigate against threats, such as man-in-the-
middle attacks.
Analysis Services
SQL Server Analysis Services (SSAS) uses Windows authentication as the basis of its security model to
authenticate users. After authentication has taken place, Windows users can be made a member of an
Analysis Services role. Permissions can then be assigned to the role to control access to an online
analytical processing (OLAP) database, data sources, a cube, dimensions, dimension members, and specific
cells within the cube. This security model can also be used to secure a data mining structure and data
mining models in traditional Analysis Services.
Integration Services
Integration Services provides a number of security layers to protect the packages that contain the logic to
move data. Package properties and digital signatures can be used to protect all types of SQL Server
Integration Services (SSIS) packages. SQL Server database roles can be used to protect packages that are
deployed to SQL Server; operating system permissions can be used to protect packages that are stored in
the file system.
Security can be applied within Master Data Services (MDS) to control access to the data that is stored
within the entities in an MDS model. You can also use the Super User permission to assign administrative
permissions that enable users to create subscription views, perform version management, or manage the
security of an MDS model. Like Analysis Services, the security of MDS is based on local or Active Directory
domain users and groups.
Data Quality Services (DQS) security is based upon the SQL Server security model, and is managed within
SQL Server Management Studio. SQL logins are added as users in the DQS_MAIN database, and associate
each user with one of the DQS roles to define the permissions.
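For example—assuming a hypothetical ADVENTUREWORKS\DataSteward login that already exists on the instance—a user can be created in the DQS_MAIN database and added to one of the built-in DQS roles, such as dqs_kb_editor:
Granting DQS permissions by adding a user to a DQS role
--Assumes a hypothetical ADVENTUREWORKS\DataSteward login already exists on the instance
USE DQS_MAIN;
GO
CREATE USER [ADVENTUREWORKS\DataSteward] FOR LOGIN [ADVENTUREWORKS\DataSteward];
GO
ALTER ROLE dqs_kb_editor ADD MEMBER [ADVENTUREWORKS\DataSteward];
GO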
In the first instance, the BI operations team should be looking to use some or all of these areas to control
access to the data that is held in the BI infrastructure, through authentication and authorization. For
highly sensitive data, encryption technologies and network protocols should be considered to provide an
additional layer of protection for the data. The key is to keep the approach as simple as possible and try
to make use of Windows security as much as possible, whilst meeting business security requirements.
Common Criteria includes the ability to view login statistics. You can enable this and other settings by
enabling the common criteria compliance server configuration option.
To configure common criteria compliance on a SQL Server instance, execute the following code:
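Common criteria compliance enabled is an advanced option, so show advanced options must be enabled first; note that a restart of the SQL Server instance is required for the change to take full effect:
Enabling the common criteria compliance enabled option
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
EXEC sp_configure 'common criteria compliance enabled', 1;
RECONFIGURE;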
SQL Server Audit provides the tools and processes that enable you to store and view audits of various server and database objects. Several components work together to audit a specific group of server or database actions.
The SQL Server Audit object is the component that is responsible for collecting a single instance of the
server or database actions that should be monitored. After this is determined, you should define one
server audit specification per audit that is held in the SQL Server Audit object. The server audit
specification collects many server-level audit action groups raised by the Extended Events feature—this is
a general event-handling system for server systems.
Alternatively, you can define a database audit specification per database that is also stored in the SQL
Server Audit object. The database audit specification collects many database-level audit action groups
raised by the Extended Events feature.
Audit action groups are predefined and include the events exposed by the database engine. Audit action
groups can be defined for both the server level and the database level. These actions are sent to the audit,
which records them in the target.
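To illustrate how these components fit together, the following sketch creates a file-based server audit and a server audit specification that captures login events; the audit name and file path are example values:
Creating a server audit and server audit specification
--Example names and file path; adjust to your own standards
USE master;
GO
CREATE SERVER AUDIT [BI_Login_Audit]
TO FILE (FILEPATH = 'D:\Audits\');
GO
CREATE SERVER AUDIT SPECIFICATION [BI_Login_Audit_Spec]
FOR SERVER AUDIT [BI_Login_Audit]
ADD (SUCCESSFUL_LOGIN_GROUP),
ADD (FAILED_LOGIN_GROUP)
WITH (STATE = ON);
GO
ALTER SERVER AUDIT [BI_Login_Audit] WITH (STATE = ON);
GO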
Reporting Services
When Reporting Services is installed, report execution logging is enabled by default and retains 60 days of
execution information within the ReportServer database, in a table named dbo.ExecutionLogStorage. This
information is exposed in a number of views that are also created by default and named
dbo.ExecutionLog, dbo.ExecutionLog2 and dbo.ExecutionLog3. The data in these views varies slightly but
contains information that includes user name, ReportID and execution time. This can provide the BI
operations team with information regarding who has accessed which report. Not only can this help from
an audit perspective, but it can also provide evidence about which reports are or are not being used.
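For example, the following query—a simple illustration against the dbo.ExecutionLog3 view, assuming the default ReportServer database name—returns the most recent report executions and who ran them:
Reviewing recent report executions in the ReportServer database
--Assumes the default ReportServer database name
USE ReportServer;
GO
SELECT TOP (100)
    UserName,
    ItemPath,
    TimeStart,
    TimeDataRetrieval,
    TimeProcessing,
    TimeRendering,
    Status
FROM dbo.ExecutionLog3
ORDER BY TimeStart DESC;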
To change the retention duration of the logs, perform the following steps:
Analysis Services
Analysis Services has a range of tools that can be used to audit access. You can use SQL Server Profiler to
run a trace against an Analysis Services instance, in addition to an Audit Login and Logout event. The
Query Logging feature that is available within Analysis Services identifies the queries that are issued
against a server and the user name that issued the query.
SSIS includes an audit transformation that you can use to create additional output columns within the data flow that hold metadata about the package, including the PackageID, PackageName, MachineName, and UserName. In addition, if SSIS is running in project mode, the SSIS Catalog can also be used to extract the same information. Because SSIS is an automated process for moving and transforming data, it is rare for security auditing to be configured for it.
Technologies such as Master Data Services and Data Quality Services have no formal built-in auditing; however, they host their databases within a SQL Server Database Engine instance, so you can use SQL Server Audit to track access to these databases.
Lesson 2
Security Components
Some common components are configured by an operations team that manages the access to data.
Authentication and authorization are the most common components that will be managed. If best
practice is employed and Windows groups are predominantly used, this will mean that, as new users join
the organization, they will be added to the correct groups that would already have access to the SQL
Server. In this scenario, the BI operations team would only have to deal with exceptions, which should be
considered on a case-by-case basis—members of the team may then have to make security changes.
Lesson Objectives
At the end of this lesson, you will be able to implement:
Policy-based management.
Some BI components, such as the database engine and Reporting Services, support both Windows authentication and mixed mode authentication. Other BI technologies only support one authentication mode. For example, you can only authenticate to Integration Services by using Windows authentication.
In addition, Reporting Services extends the authentication capability to use either basic authentication or
a custom forms-based authentication. As Reporting Services is a web application, the extensibility of the
authentication enables it to be embedded within custom applications that may use other forms of
authentication.
The BI operations team must decide on a model. SQL Server and Windows Authentication mode should
be used to enable any existing third-party applications to operate. For high security networks, Windows
Authentication mode may be considered to apply security under the control of the Active Directory
network.
Server Level
To create a SQL Server login with a Windows account, you should perform the following steps:
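In Transact-SQL, the equivalent is a single CREATE LOGIN statement; the following sketch assumes a hypothetical ADVENTUREWORKS\BI_Analysts Windows group:
Creating a SQL Server login from a Windows group
--Assumes a hypothetical ADVENTUREWORKS\BI_Analysts Windows group
CREATE LOGIN [ADVENTUREWORKS\BI_Analysts] FROM WINDOWS;
GO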
For other BI technologies, the concept of creating a server login is not applicable. For example, no explicit
server login is required for Integration Services, Data Quality Services or Master Data Services. While
Analysis Services does not require a server login to be created, you will need to define a Windows user as
an administrator. Reporting Services also requires a Windows user or SQL login to be mapped to the
system administrator role. This means that the user can then administer the system.
Database Level
The most common objects to create and manage within a database are database users and database
roles. It is typical to map a SQL Server login to a database user, and then use either the built-in or user-
defined roles to control access to the data within a database.
To create a database user by using SQL Server Management Studio, you should perform the following
steps:
In SQL Server Management Studio, in Object Explorer, expand the Databases folder.
Expand the database where you want to create the new database user.
Right-click the Security folder, point to New, and then click User.
On the General page, in the User name box, type a name for the new user.
In the Login name box, type the name of the SQL Server login to map to the database
user.
Click OK.
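The equivalent Transact-SQL, shown here as an illustration that assumes a hypothetical ADVENTUREWORKS\BI_Analysts login and the AdventureWorksDW database, creates the database user and then adds it to a built-in database role:
Creating a database user and adding it to a database role
--Assumes a hypothetical ADVENTUREWORKS\BI_Analysts login and the AdventureWorksDW database
USE AdventureWorksDW;
GO
CREATE USER [BI_Analysts] FOR LOGIN [ADVENTUREWORKS\BI_Analysts];
GO
ALTER ROLE db_datareader ADD MEMBER [BI_Analysts];
GO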
You can gain access to other BI services by adding a Windows user or SQL login to a built-in or user-
defined group within the technology—or, as with Analysis Services, permissions can be assigned directly
to a user.
Permissions
Access to objects within any of the SQL Server BI technologies is controlled by permissions that can be
assigned as follows:
When first installed, each SQL Server BI technology contains built-in roles, and the approach of role-based
permission management follows best practice for controlling access. For example, Reporting Services
contains the System Administrator and System User roles at the site level. At the content level, there are
built-in roles that include Content Manager and Browser. Each of these roles contains preset permissions
that determine a user’s level of access. It is best practice to make use of the built-in roles as much as is
possible. If a role does not meet the needs of a BI operations team, it can create a new role, and then
assign the required permission before adding the users.
Keys can either be symmetric or asymmetric. When using a symmetric key, the same key is used to
encrypt and decrypt the data. As a result, the key must be shared by the user or system that encrypts the
data, and by the user or system that decrypts the data. Symmetric keys can be created and protected by a
password or a certificate.
To create a symmetric key, you can run the following Transact-SQL code:
Creating a symmetric key using the 256-bit Advanced Encryption Standard (AES) that is encrypted
with a password
CREATE SYMMETRIC KEY [SK_MIA-SQL] WITH ALGORITHM = AES_256
ENCRYPTION BY PASSWORD = 'Pa55w.rd';
GO
Asymmetric keys consist of a pair of related keys. One of the keys is known as the public key; it is associated with the other key, known as the private key, which is held by the private key owner.
To create an asymmetric key, you can run the following Transact-SQL code:
Creating an asymmetric key using the 2048-bit RSA encryption algorithm that is encrypted with a
password
CREATE ASYMMETRIC KEY [AK_MIA-SQL]
WITH ALGORITHM = RSA_2048
ENCRYPTION BY PASSWORD = 'S4nFr4nc1sc0';
GO
Typically, you would have to install third-party EKM software, which registers a cryptographic provider within SQL Server. You would then perform additional steps to encrypt the data at rest; this is typically achieved with a feature known as transparent data encryption (TDE).
Transparent data encryption is enabled on a database level, and performs real-time encryption and
decryption of the data and log files without the need to rewrite applications. A database encryption key
(DEK) is stored in the database boot record for availability during recovery. The DEK is a symmetric key
secured by using a certificate stored in the master database of the server or an asymmetric key protected
by an EKM module.
The transfer of data over a public network, from one site to another, would require protection because
data is being transferred across the Internet. For this reason, Windows Server provides the capability, using
the Routing and Remote Access Server, to create site-to-site virtual private network (VPN) tunnels—when
the VPN is created, encryption protocols can be defined to secure the connection. The IPSec protocol is a
VPN encryption protocol that you can use to protect data over the wire. The configuration of the secure
VPN would be performed by the network administration team. This should be considered if the
movement of data takes place from different sites, when it is being loaded into a data warehouse.
Create a database master key that is encrypted with the password S4nFr4nc1sc0.
Create a certificate named ServerCert with the description used for TDE.
Create a database encryption key that uses the AES_128 algorithm and is encrypted by the
certificate ServerCert.
Enable transparent data encryption on the AdventureWorksDW database.
USE master;
GO
CREATE MASTER KEY
ENCRYPTION BY PASSWORD = 'S4nFr4nc1sc0';
GO
CREATE CERTIFICATE ServerCert
WITH SUBJECT = 'Used for TDE'
GO
USE AdventureWorksDW
GO
CREATE DATABASE ENCRYPTION KEY
WITH ALGORITHM = AES_128
ENCRYPTION BY SERVER CERTIFICATE ServerCert
GO
ALTER DATABASE AdventureWorksDW
SET ENCRYPTION ON
GO
Policy-based Management
Policy-based management enforces the
configuration of different aspects of one or more
instances of SQL Server. After a policy is defined,
SQL Server enforces the settings within that policy.
This can include forcing the naming conventions
of SQL Server objects and forcing the recovery
model of a database. Policy-based management
can be used to enforce SQL Server standards for
the data warehouse that are in line with the BI
operations team server standards.
Policy. A policy is an object that holds the information required to enforce a policy. A policy consists
of a facet, which represents a SQL Server object, and a condition, which is applied to an object. Finally,
a target is defined—this is typically a condition that defines a server name.
Facet. A facet is an object within a policy that contains properties that relate to a specific SQL Server
object. Facets can include objects such as databases, views, and stored procedures. They can also
represent surface area configuration for SQL Server components, such as the database engine,
Analysis Services and Reporting Services.
Condition. A condition defines a set of allowed states for a facet against a given target. A policy can
consist of only one condition.
Policies can be imported and exported between different instances of SQL Server, in an XML format. You can replicate policy settings on other instances without the policy being recreated manually.
Question: Does your organization use Active Directory groups or Active Directory users to access SQL Server BI resources?
Lesson 3
Security Approach for BI Components
Defining a security approach for each BI technology has the benefit of establishing a baseline on which
troubleshooting can be centered, in addition to setting a standard. Common standards should be
followed across all BI technologies, but each technology also has its own unique considerations that need
to be taken into account.
Lesson Objectives
In this lesson, you will describe the security approach for:
Database Engine
Integration Services
Analysis Services
Reporting Services
Data Quality Services
Database Engine
The operational security approach for server and
database access should be conducted using the
following guidelines:
Windows logins can take advantage of Windows groups to organize users into logical groupings. Placing
Windows logins into a Windows group will simplify the administrative effort of managing users for the BI
operations team. Active Directory administrators are responsible for the management of Windows users
and groups. This means that the BI operations team will only be responsible for deciding which user or
group can access the SQL Server instance. Using Windows groups will further simplify management
from a SQL Server perspective, because there will be fewer objects to manage.
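As an illustration of this guideline, a login can be created for an Active Directory group with a single T-SQL statement. The domain and group names below are hypothetical:

USE master;
GO
-- Create a login for a Windows group (hypothetical domain\group name)
CREATE LOGIN [ADVENTUREWORKS\BI_SalesReaders] FROM WINDOWS;
GO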
Create a SQL Server login for each user or application that requires access to the server
If SQL Server logins are required, you should create a separate SQL login for each user or application. This will
help with the auditing capabilities of SQL Server.
When a login is created in SQL Server, you can map the SQL login to a database user. The BI operations
team could potentially encounter support issues for users who believe they should have access to a SQL
Server database, but are unable to do so. This scenario occurs when a SQL login is deleted but the
associated database user remains. These users are then referred to as orphaned users.
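For example, a login can be mapped to a database user, and orphaned users can be detected by comparing database principals with server principals. This is a sketch only; the database, login, and user names are illustrative:

USE AdventureWorksDW;
GO
-- Map an existing login to a database user (hypothetical names)
CREATE USER SalesReader FOR LOGIN [ADVENTUREWORKS\BI_SalesReaders];
GO
-- List SQL-authenticated database users whose SID has no matching server login (orphaned users)
SELECT dp.name AS orphaned_user
FROM sys.database_principals AS dp
LEFT JOIN sys.server_principals AS sp
    ON dp.sid = sp.sid
WHERE sp.sid IS NULL
  AND dp.authentication_type_desc = 'INSTANCE';
GO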
Make use of the built-in roles as much as possible
SQL Server provides a wide range of built-in server and database roles that contain a default set of
permissions to control access to resources. For example, the database role db_datareader enables users to
read data in the database. The objective is to simplify the management of access control. It is best practice
not to modify the default permissions for built-in roles; if a built-in role does not meet your access control
requirements, you should create a new role and document it as an accepted standard.
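To illustrate, again with hypothetical names, a user can be added to a built-in role, and a documented custom role can be created where the built-in roles do not meet the requirement:

USE AdventureWorksDW;
GO
-- Add an existing database user to the built-in db_datareader role
ALTER ROLE db_datareader ADD MEMBER SalesReader;
GO
-- Create a custom role and grant it SELECT on a single schema only
CREATE ROLE SalesSchemaReaders;
GRANT SELECT ON SCHEMA::Sales TO SalesSchemaReaders;
ALTER ROLE SalesSchemaReaders ADD MEMBER SalesReader;
GO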
Manage permission assignment and exceptions through a help desk management tool
Requests for security access should be managed through a help desk management tool. The BI operations
team can use this to record details of a request and their response to it. Exceptions in permissions requests
that are granted should be recorded in an exceptions document, along with some evidence of a signed-
off business approval.
The decisions that are made regarding the guidelines should provide a baseline that can be incorporated
into standards.
The benefit of deploying to SQL Server is that it provides an additional layer of security that can be used
to protect the packages. The file system deployments rely only on the NTFS security that is provided by
Windows.
The SSIS Catalog provides an ssis_admin role to manage all SSIS operations, and an ssis_logreader role to
enable members to view the logs that are generated by SSIS.
For the msdb database, there are three fixed database-level roles—db_ssisadmin, db_ssisltduser, and
db_ssisoperator—for controlling access to packages that are saved to the msdb database. By default, the
sysadmin SQL Server role is a member of the db_ssisadmin role. The db_ssisltduser role can view all
packages but only manage their own packages; the db_ssisoperator role can only view the packages
stored in the MSDB database.
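For example, membership of these roles is managed in the same way as any other database role. The user name below is hypothetical and must already exist as a user in msdb:

USE msdb;
GO
-- Allow an existing msdb user to view packages stored in msdb (hypothetical user name)
ALTER ROLE db_ssisoperator ADD MEMBER PackageViewer;
GO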
Package protection levels control access to the contents that are held within a package. This can involve
securing the entire package contents or only sensitive data, such as connection information. The
mechanism for providing this protection can be achieved by using a password or user key, or relying on
SQL Server storage.
6. To save the updated package, click Save Selected Items on the File menu.
Integration Services can then be configured to check for the existence of a certificate by using the Integration
Services section of the Tools and Options dialog box. Within this is an option to check digital signatures when
loading a package.
also be used to secure data mining structures and data mining models.
The BUILTIN\Administrators Windows group is a member of the Analysis Services administrators’ role by
default. In other versions of SQL Server, no Windows groups are added to this role. As a result, during the
installation of Analysis Services, you can configure the membership of this role so that only the intended
administrators are members. If this is not done during the installation, it can be updated within
SQL Server Management Studio.
You can use SQL Server Analysis Services to create roles in both Visual Studio and SQL Server
Management Studio.
Creating roles within SQL Server Management Studio means changes are performed on the deployed
system and will take effect the next time the user connects to the Analysis Services instance. Note that the
change that is made by using SQL Server Management Studio will not be reflected within the Visual
Studio project—it could result in the roles being overwritten or removed should the deployment of the
project define that role members are dropped and created. As a result, you should define standards for
how database roles are created within Analysis Services.
If a role provides access to an object, then a member of that role has access to the object, regardless of
whether they are explicitly denied access to the object in another role. As a result, the deny permission is
overridden.
Use predefined system and item level roles to fulfill security requirements
In Reporting Services, two default system roles are created—System Administrator and System User. The
available default item level roles include Browser, Content Manager, Publisher, My Reports, and Report
Builder. These roles should be used as much as possible to facilitate permissions management. These roles
should also be used to fulfill as much of the access control as possible.
You can create user-defined roles for security requirements that cannot be met by the predefined roles.
Predefined roles should not be modified as this can increase the complexity of resolving access control
issues.
It is best practice to make use of these roles to manage access control for DQS.
Model security
Hierarchy permissions
Use the SQL Server administrators as the MDS
administrator
MDS server security defines which users have full control of the server, its functional areas, models,
and hierarchies. By default, this is populated with the user who installed MDS and is part of the
administrators group on the local server. You should replace this account with the Windows group that
represents the team that manages the operations of the Master Data Server, and then assign the
functional area of Super User.
Depending on the number of users who manage and use MDS, you can create a Windows group that
represents the functional areas that this group will manage. The available functional areas include:
Explorer
Version Management
Integration Management
System Administration
Use existing groups or create Windows groups for each model area and hierarchy data
An MDS model is a container for MDS entities that store the master data. For MDS estates that are used
by multiple users, you can use existing Windows groups to control access to the model data.
Lesson 4
The Security Approach in Different BI Environments
Different environments will have different security requirements to meet their functional needs.
Production environments will be subjected to tight security policies to ensure protection of the data and
technologies on which the data is hosted. In a nonproduction environment, such as a development
environment, users may have elevated permissions to ensure that they can successfully carry out their role
of developing BI solutions.
Lesson Objectives
At the end of this lesson, you will be able to describe security in:
Production environments.
Nonproduction environments.
Production Environments
The production environment should be subjected
to the tight security controls—signed off by the
business—that have been outlined in this module.
Not all practices that have been covered in this
module may be used, but there should be clear
documentation that outlines the security in the
following areas:
Physical security
Authentication
Authorization
Encryption requirements
Auditing policies
Service maintenance
The BI operations team should support the BI solution in line with the security strategy that has been
documented. In addition, exceptions to the standards defined within the security documentation should
also be documented and signed off.
Nonproduction Environments
When you consider the security requirements for
nonproduction environments, you need to achieve
a balance between providing protection for the
data and making sure that users can perform their
role. Each business will take advice from their
security team before deciding how they should
approach security.
Test environments
A test environment may contain a subset of production quality data to provide testers with real-world
scenarios. As a result, security in the test environment will be stricter than that of the development
environment. Access should only be given to those users who have to test, and those who have to validate
the tests. It is not uncommon for testers to have select, insert, update and delete permissions for the data
in these environments. A subset of developers may have select permissions for the data within these
environments.
UAT environments
UAT environments should closely replicate the settings that are used within a production environment. In
these environments, a small group of business users are invited to explore the solutions and data in the
environment, and then provide feedback. It is best practice for all users to be given a dedicated test
account to perform the testing. This keeps the test accounts separate from their production accounts and
ensures that security in the two environments remains independent. Testers might have select permissions for the UAT
environment; developers typically do not receive access to the environment.
You have been asked to demonstrate some of the key security points raised with the team at Adventure
Works. Specifically, you should show the difference between a Windows and a SQL login, and how to
create a database user from a login. You have also been asked to demonstrate how to set up security in
Analysis Services and Reporting Services.
Objectives
After completing this lab, you will be able to:
Password: Pa55w.rd
2. Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and Windows Authentication
5. Create Database Users in the AdventureworksDW Database from the Windows Group
DL_ReadSalesData and Grant Select Permission to the Sales Schema.
Task 2: Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and
Windows Authentication
1. Open SQL Server Management Studio.
3. Set the authentication mode on MIA-SQL to SQL Server and Windows Authentication if this mode
is not already active.
2. Grant Select permission to the SalesReaders user over the EDW schema in the EIM_Demo database.
Created a database role and added a database user within the role.
2. In SQL Server Management Studio, connect to the Reporting Services MIA-SQL\SSRS instance.
3. Create a new system level role named SecurityAdmin and assign the manage report server security
permissions.
Question: Which best practices for security would you envisage being able to implement
when you return to your own workplace?
Question: At any point, did the lab not follow security best practice?
Module 4
Deploying BI Solutions
Contents:
Module Overview 4-1
Lesson 1: Application Life Cycle Management for BI Solutions 4-2
Module Overview
Deploying BI solutions is a discrete part of the BI development life cycle. The BI operations team will be
called upon to support the development team during the deployment. The aim is to successfully create
the solution within a production environment for operational use. The presence of nonproduction
environments provides the opportunity to practice the deployments before they are conducted on a
production server, so that the deployments can run smoothly.
A variety of tools and practices can be used to aid deployments. Each method has its own benefits and
can be used in any environment. Understanding the tools that are available and the benefits they offer
will help you to pick the right tool for the job and aid deployment.
Objectives
At the end of this module, you will understand:
Stand-alone deployments.
Team-based deployments.
Lesson 1
Application Life Cycle Management for BI Solutions
The deployment of a BI solution does not stop with the completion of a major BI project. After users start
to gain value from the solution, the BI development team will be asked to add new functionality to further
increase that value. The job of the BI operations team is to support the development team in performing
the deployments, so that there is minimal disruption to the live solution.
Therefore, an understanding of the BI development life cycle, and how source control and Team
Foundation Server (TFS) can help with this, is important to the success of managing the BI life cycle. The BI
operations team can help to ensure that the business receives the updates in an efficient manner.
Lesson Objectives
At the end of this lesson, you will be able to describe:
Testing. Testing provides the first opportunity to test the deployment mechanisms that will be used
in all of the environments. It is important for the BI operations team to be involved at this stage.
There may be resistance to this, but the process of deploying the solution to an environment should
be tested, so that the team can identify and deal with any problems that might occur. In this phase,
the objective is for the BI operational team to define standards for deployments.
Deployment. By the time that a deployment occurs in a production environment, standards should
be set from the deployments that have taken place in nonproduction environments. There are
different methods to deploy a solution but the aim is for the deployment to be automated as much
as possible, so that it can occur seamlessly. This capability should also be practiced in a
nonproduction environment. It is also best practice to create a documented build guide and rollback
document to support the production deployment.
approach to delivery, where the development of BI functionality occurs sequentially, and the output is for
a discrete portion of functionality, which is clearly defined against a given schedule. Alternatively, an agile
approach may be used by organizations to deliver multiple pieces of functionality across different parts of a
team at the same time.
Regardless of the approach, it is prudent to determine a period within the schedule where there is a code
freeze. The BI operations team can then collate the checked-in versions of the code into a build; produce
a build guide that describes the actions to take during the deployment; and then continue with a release
by deploying the code through a variety of methods. This does not stop developers from working,
providing they do not check in new BI functionality during the code freeze. TFS provides the functionality
for developers to “shelve” changes. In this scenario, development efforts are saved to the TFS server, but
are not committed as a version to the TFS database. This means that the BI operations team can continue
to manage a release and build without disruption to the developers.
The BI operations team can use a range of technologies to manage deployments of the solution. It is
important that this team communicates the process and the steps for the deployment, in addition to any
rollback strategy that may be required, should a deployment fail.
Lesson 2
Stand-alone Deployments
Stand-alone deployments are useful for development teams that are deploying to nonproduction
environments, or for individual teams deploying to a production environment. SQL Server and
Visual Studio® provide a wide range of tools that can support stand-alone deployments of both
databases and BI components. You can also automate the deployment process by using a range of
scripting languages.
Lesson Objectives
At the end of this lesson, you will be able to perform:
DACPAC deployments
BACPAC deployments
A BI developer can create a database on their local instance. After the database creation is complete, the
following steps can be performed to create a data tier application for deployment to another instance:
1. Open SQL Server Management Studio and connect to the instance that contains the database.
2. In Object Explorer, expand the Database node, right-click a database, point to Tasks, and click
Extract Data-tier Application.
3. In the Extract Data-tier Application window, on the Introduction page, click Next.
4. On the Set Properties page, type a name for the application, and then type a version number under
Version; optionally, you can add a description, and then under Save to DAC package file, browse to
a location to store the file. Optionally, you can select the check box to overwrite an existing package
with the same name, and then click Next.
To deploy a data tier package, you should perform the following steps:
1. Open SQL Server Management Studio and connect to the instance that contains the database.
2. In Object Explorer, right-click the Database node, and click Deploy Data-tier Application.
3. In the Deploy Data-tier Application window, on the Introduction page, click Next.
4. On the Select a Package page, browse and select the dacpac package previously created, and click
Next.
5. On the Update Configuration page, under Name, optionally change the name of the database, and
then click Next.
6. On the Summary page, click Next.
7. On the Deploy DAC page, wait for the deployment to complete, and then click Finish.
On completion, the data tier application deployment will create the database and the objects that are
contained within the database, but it will not contain data.
BACPAC Deployments
If a developer needs to deploy tables
and their associated data, an alternative approach
to using a DACPAC is to create a BACPAC. The
fundamental difference between the two is that a
BACPAC contains both the schema and the data
when created and deployed.
To create a BACPAC, you should perform the
following steps:
1. Open SQL Server Management Studio and
connect to the instance that contains the
database.
2. In Object Explorer, expand the Database node, right-click a database, point to Tasks, and click
Export Data-tier Application.
3. In the Export Data-tier Application window, on the Introduction page, click Next.
4. On the Export Settings page, in the settings tab, select the radio button next to either Save to local
disk, or Save to Windows Azure™, and browse to a storage location.
7. On the Operation Complete page, when the build is complete, click Close.
To reuse the BACPAC on another SQL Server instance, you should perform the following steps:
1. Connect to the instance of SQL Server, whether that is on-premises or in the Azure SQL Database.
2. In Object Explorer, right-click Databases, and then select the Import Data-tier Application menu
item to launch the wizard.
4. On the Import Settings page, select the radio button next to either Import from Local Disk or
Import from Windows Azure, and browse to a storage location, and then select the BACPAC. Click
Next.
5. On the Database Settings page, change the name of the database, and then click Next.
Clustered index
Nonclustered index
Spatial index
Unique index
Logins
Users
Database roles
Role memberships
Permissions
Schemas
Statistics
Synonyms
Check constraint
Collation settings
Columns
Computed columns
Default constraint
Unique constraints
DML triggers
Tables
Views
If a database contains objects that are not supported, then an error is returned in the wizard.
The deployment settings for a database project can be defined by right-clicking the database project, and
then clicking Properties.
In the Project Settings page, you can define the following settings:
Target platform. This defines the version of the SQL Server platform that the project files are
intended for.
Output types. This provides the ability to generate a DACPAC file and/or a SQL script that can be
used to deploy to an instance of SQL Server.
General. This defines the default schema in the database and whether to include the schema name in
the filenames that are generated.
You can use the additional pages that are available to define how to manage SQLCLR, debugging levels
and builds with pre- and post-deployment scripts that are important to team deployments.
SQL Server provides the capability to perform incremental package deployments. This means that
selected packages can be deployed from a project, rather than having to deploy all of the packages stored
in a project. The process to create a deployment .ispac file still exists and requires the SSIS project
properties to be set to define the location for the .ispac file, by performing the following steps:
1. In Visual Studio, right-click the project, and click Properties.
2. Note that in the General page under Configuration Properties, in the Configuration drop-down list,
it states Active(Development).
3. In the Build page, specify the output path, which by default is set to bin.
4. In the Deployment page, set the Server Name and the Server Project Path.
5. Optionally, set additional options in the Debugging page, and click OK.
After deployment, you can browse to the output path in the project file location. If the default value is
used, a folder named “bin” will appear. Within the bin folder, another folder named “Development” will
appear, reflecting the configuration that was used. The .ispac file will now be ready to be deployed.
To deploy the SSIS solution, double-click the .ispac file and perform the following steps:
3. In the Packages Folder Name, browse to the location of the SSIS project files to display a list of
packages, and then select the packages to deploy. Optionally, specify any passwords and click Apply,
click Refresh, and then click Next.
4. In the Select Destination page, type the server name and path, then click Next.
Visual Studio can be used to deploy an Analysis Services project—it builds a complete set of XML files in
an output folder that has the commands required to build all of the Analysis Services database objects in
the project. A number of properties can be set in the Analysis Services project properties, under the
Deployment page, including:
Process Options. This option determines whether the cube is processed as it is deployed.
Transactional Deployment. This determines whether the deployment is a transaction and will roll
back the deployment should it fail.
Deployment Mode. This specifies whether all, or only the changes, of the Analysis Services objects
are deployed.
Server. This is the name of the server to which the Analysis Services solution is deployed.
Database. This is the name of the database in which the Analysis Services solution is deployed.
Alternative methods for deploying an Analysis Services solution include the Analysis Services Deployment
Wizard, Backup and Restore of Analysis Services databases, and XMLA scripting.
Some properties must be set to ensure the successful deployment of reports. Some of these options can
be defined within the report wizard. For reports that are not created within the report wizard, you can use
the reporting services project properties to set the same and additional options by right-clicking an SSRS
project and clicking Properties. Within the deployment options there are a number of settings, including:
Overwrite data sources | Overwrite Datasets. By default, this is set to false. If set to true, any data
sources or datasets that are edited within the report project will overwrite any existing data sources or
datasets on the report server.
TargetServerURL. This is the web address for the report server on which the reports will be located.
Note that, when deploying to SharePoint® services, you must specify a URL with the report folder
and the data sources folder.
TargetServerVersion. This specifies the version of Reporting Services that the reports are intended
for.
Once the settings are complete, you can right-click the SSRS project and click Deploy.
3. In the Data Quality Client home screen, open a knowledge base in the Domain Management activity.
4. In the Domain Management page (with any tab selected), click the Export Knowledge Base data
icon above the Domain list, and then click Export Knowledge Base.
5. In the Export to Data File dialog box, go to the folder in which you want to save the file. Name the
file or keep the knowledge base name, keep DQS Data Files (*.dqs) as the Save as type, and then click
Save.
3. In the Data Quality Client home screen, click New Knowledge Base.
5. Click the down arrow for Create Knowledge Base from, and then select Import from DQS file.
7. In the Import from Data File dialog box, go to the folder that contains the .dqs file that you want to
import, and then click the name of the file. Click Open and then click Next.
8. Select the activity that you want to perform, and then click Create.
9. Click Publish to publish the knowledge in the knowledge base, and then click OK.
10. In the Data Quality Services home page, verify that the knowledge base is listed under Recent
knowledge bases.
Master Data Services provides a number of tools with which you can manage the movement of a model
between different instances of SQL Server, depending on your requirements. If you need to move both
the model structure and its data, you can use the MDSModelDeploy tool to create a package. You can
then use this package to create a new model, create a clone of a model, or update an existing model and
its data. This will affect the command used when deploying the model to the server. If the requirement is
to only move the structures, you can use the Model Deployment Wizard.
To create a package using the MDSModelDeploy tool, you should perform the following steps:
3. To determine the name of the service you will deploy to, type MDSModelDeploy listservices in the
command prompt, and then press Enter.
4. To create a package named Agents, using MDSModelDeploy, from the model named Insurance, using
the PreProd version from the service named MDS that includes data, type the following command:
MDSModelDeploy createpackage -modelname Insurance -version PreProd -service MDS -package Agents
-includedata, and then press Enter.
To deploy a package to another server using the MDSModelDeploy tool, you should perform the following
steps:
1. Open the command prompt, using Run as Administrator.
3. To determine the name of the service you will deploy to, type MDSModelDeploy listservices in the
command prompt, and then press Enter.
4. To deploy a new model, type the following command in the command prompt:
MDSModelDeploy deploynew -package Agents -model Insurance -service DefaultWebsite
5. To clone the model from the package, type the following command in the command prompt:
MDSModelDeploy deployclone -package Agents
6. To update an existing model, type the following command in the command prompt:
MDSModelDeploy deployupdate -package Agents -version PreProd
Scripted Deployments
You may want to automate deployments to a
production server as much as possible—to
remove the human interaction that could lead to
mistakes being made. A number of options are
available to facilitate this, including:
For example, you could use SQLCMD to refer to sql scripts that can be executed in a specific order. If the
development team is disciplined enough, they may keep a script of all the database objects that have
been created. In this case, a SQLCMD command can be used to call the sql scripts one by one in an
orderly manner. Alternatively, if a naming scheme has been defined for the sql scripts, they could be
executed in bulk.
For example, all tables created in a Staging database may have sql files that begin with STG stored in a
folder named C:\DBObjects\Staging. This means that a batch file could be created and stored in the
C:\DBObjects\Staging folder that executes all of the files beginning with STG against the local server in a
database named Staging, using the following command:
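One possible form of such a batch file, assuming sqlcmd is available on the path and Windows Authentication is used against the local default instance, is shown below. This is a sketch rather than the exact command from the original course files:

REM Execute every STG*.sql script in this folder against the Staging database
for %%f in (C:\DBObjects\Staging\STG*.sql) do sqlcmd -S localhost -E -d Staging -i "%%f"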
You can generate XML for Analysis (XMLA) scripts by using the
deployment wizard, which allows you to execute the script in SQL Server Management Studio, or use the
ASCMD utility to execute the XMLA in a scripted manner. The XMLA script created from the deployment
wizard will recreate the database objects that are defined within the script file.
The XMLA script consists of settings that you can use to create the Analysis Services objects. It could also
contain the settings required to process the Analysis Services database and the objects found in the script.
You can use any editor to edit the XMLA script to add custom objects through the XMLA language, when
you have stored the XMLA script in a saved file.
To create an XMLA script using the deployment wizard, perform the following steps:
1. Click Start, type Deployment Wizard, and then click the Deployment Wizard icon.
2. On the Welcome to the Analysis Services Deployment Wizard page, click Next.
3. On the Specify Source Analysis Services Database page, browse to the folder location in the
Database file box and select the .asdatabase file, and then click Next.
4. On the Installation Target page, change the value in the Database box to define the name of the
database, and then click Next.
5. On the Specify Options for Partitions and Roles page, you can optionally create partitions and
roles, and then click Next.
6. On the Specify Configuration Properties page, specify any required configuration options and click
Next.
7. On the Select Processing Options page, select the desired processing options and click Next.
8. On the Confirm Deployment page, select the Create deployment script check box, browse to a
location to store the XMLA file, and then click Next.
9. On the Deploying Database page, wait for the deployment script to be completed, and then click
Next.
11. The XMLA script file will appear in the folder defined in the wizard.
The dtutil command can be used to manage SSIS packages. This can include moving and copying
packages to SQL Server or to a folder location in Windows. You can also use the tool to delete SSIS
packages. When moving SSIS packages to SQL Server, you might be required to define credentials in the
command; without this, dtutil will attempt to execute the command as the user who is running it.
The following example uses dtutil to copy a package named DWLoad that is stored in the msdb database
on a local instance of SQL Server using Windows Authentication to the SSIS Package Store in a folder
named ETL.
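A command of this form might look like the following. This is a sketch; the exact destination path depends on how the folders in your SSIS Package Store are configured:

REM Copy the DWLoad package from msdb on the local instance to the ETL folder in the SSIS Package Store
dtutil /SQL DWLoad /COPY DTS;ETL\DWLoad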
This example copies a local file system SSIS package named DWLoad, located in the root of the C: drive, to an instance of
SQL Server named Seattle.
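Again as a sketch, and assuming the package file sits in the root of the C: drive, the command might be:

REM Copy a file system package to the msdb database on the Seattle instance
dtutil /FILE C:\DWLoad.dtsx /DestServer SEATTLE /COPY SQL;DWLoad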
Lesson 3
Team-Based Deployments
The ability to deploy a BI solution to different environments is important to the BI development life cycle.
This operation can be done by using the build and release management capability of TFS. After you have
become familiar with this, you can streamline deployments to different environments using the same code
base. This will provide consistency in the solutions, and is particularly important for UAT and production
environments.
Deployments can sometimes go wrong so it is important that the management of builds and releases is
supported with an appropriate rollback strategy. Having a rollback strategy in place will ensure that you
can bring an environment back to a known good state if required.
Lesson Objectives
In this lesson, you will learn about:
Team Foundation Server (TFS).
Release Management. This involves collating the correct code that is part of the release and
ensuring that the correct settings are defined for the environment in which the release will take place.
Build Management. This provides the ability to create a build definition that
will be used to deploy the code that is defined within a release.
Rollback Management. This involves the steps and technologies used to manage the rollback of a
deployment to a known state for the purpose of ensuring continued operations of a given
environment. Hopefully, a rollback strategy will not be required. However, if a build does fail to
deploy, the BI operations team should be able to restore the environment to an operational state as
quickly as possible.
The operations team should also collate a build guide that contains release notes—these should provide
an overview of the functionality that is part of the release. The build guide should contain the detailed
steps on how the deployment will be performed. There should also be documentation that refers to the
rollback approach that would be undertaken should the deployment fail.
2. In Team Explorer, right-click the solution or folder you want to branch, point to Branching and
Merging, and then click Branch.
3. In the Branch dialog box, type a name, and then click OK.
4. A new item, which has the name defined in the Branch dialog box, will appear in Team Explorer.
5. Open the solution file within the new branched code and work as normal.
When the time comes to define a release, a branch of the code can be merged back into the master
branch of code ready to be deployed. For items where there is a conflict, a window appears that enables
you to accept or reject changes in code from either the master or the branch.
Merging Code in Team Explorer
To merge branch code back into the main code, perform the following steps:
1. In Team Explorer, right-click the solution or folder that is the branch, point to Branching and
Merging, and then click Merge.
2. In the Merge dialog box, select the branch to merge, select the main code branch to merge it into, and
then click OK.
3. Open the solution file for the main code to validate that the merge has been successful.
The release management in TFS is performed in the TFS web portal that can be accessed from Team
Explorer. This means that you can set a release definition that will contain the following information:
The development objects that make up new releases.
The settings are contained within a release definition that will be managed by a TFS build agent—a
service that is responsible for executing the release definition. To define an agent, go to the TFS web
portal, click Administer server, and then select the agent pool tab.
Tasks. This is used to define the solution that will be part of the build and the build steps in a specified
build. Additional properties can be configured, including the platform, configuration and Visual Studio
version where the solution is targeted. You can also specify the source of the working folder that stores
the code; for example, Team Foundation Version Control, GitHub or Subversion.
Variables. You can add variables to the build that can be used within the project. By default, this will
include system variables such as BuildPlatform and TeamProject. You can also add user defined variables,
which are specific to the solution created, to the build management.
Triggers. This determines how a build is to be executed. Multiple triggers can be set up and configured
to execute in one of two ways:
History. This retains a history of the builds, and has a Diff command that can look at the differences in
each build created.
For a build to be queued, a build agent must be installed on the TFS server. When this is done and a build
definition is set, it can be managed in Team Explorer—this includes the ability to view the builds that are
queued or completed, edit the existing build definition, and carry out other tasks, including:
Delete a build.
Secure a build.
As an alternative, build automation can be provided by using MSBuild.exe and the scripts for the project.
Demonstration Steps
Open the Builds Console in Internet Explorer
4. In the Builds window, click New Build Definition. Internet Explorer starts.
Download the Build Agent
4. In the Internet Explorer notification bar, click the Save drop-down arrow, and then click Save as.
3. At the Enter Server URL prompt, type the following text, and then press Enter:
http://mia-sql:8080/tfs
4. At the Enter authentication type prompt, press Enter to accept the default value of Integrated
authentication.
5. At the line Enter agent pool prompt, press Enter to accept the default value.
6. At the Enter agent name prompt, accept the default value (MIA-SQL), and press Enter.
7. At the Enter work folder prompt, type D:\Demofiles\Mod04\agent\_work, and then press Enter.
8. At the Enter run agent as service? prompt, type Y, and then press Enter.
9. At the Enter the user account to use for the service prompt, type AdventureWorks\ServiceAcct,
and then press Enter.
10. At the Enter Password for user account AdventureWorks\ServiceAcct prompt, type Pa55w.rd,
and then press Enter.
Rollback Strategies
You should not overlook the BI operation team’s
ability to roll back the BI code that is deployed,
should an error occur with the release and build of
the solution. In this situation, the team can
respond by performing one of the following three
steps to mitigate a failed deployment:
If time permits, fix the issue and perform
another release. This approach can work in
situations where there is a long time window
to perform a deployment. For example, some
organizations may not operate at weekends,
so releases are performed on Friday night. In
such circumstances, the release manager might say that there would be enough time to fix an issue
and redeploy the solution. The risk is that what seems to be a simple issue might be a symptom of a
bigger problem. This would mean that there is not enough time to fix and deploy—in which case, the
next rollback option would still have to be used.
Undo changes and redeploy a previous release. You can adopt this pragmatic approach to return
the code base of a BI solution to a known state. The downside is that no new functionality will be
deployed as the deployment is effectively postponed. This option should be used in scenarios where
there is a small time window for a deployment.
Continue with a deployment and apply a hotfix. Some deployments may contain errors that do
not affect the functionality of key components of the BI solution, but do affect a discrete part. In such
scenarios, the BI operations team might apply a temporary fix for the deployed solution in an
environment, with the intention of retrospectively fixing the issue in the code for the next release. This
approach works well for anticipated or known errors in a deployment.
If a rollback is necessary, it is important to inform the data director or manager, so that the appropriate
communications can be sent out within the business.
Question: What rollback strategies do you need to employ if a production deployment fails?
You are a consultant working with the BI operations team to improve the operational management of
their current BI solution. The BI team have requested that you demonstrate how databases can be
managed between different environments using DACPACs and BACPACs. They also want a demonstration
on how builds can be managed using Team Foundation Server. Finally, they want to look at the different
examples of managing deployments.
Objectives
At the end of this lab, you will show:
How to create a DACPAC from SQL Server Management Studio and Visual Studio.
How to create a build in Team Foundation Server.
Password: Pa55w.rd
3. Connect to the MIA-SQL instance of Analysis Services and execute the AW_SSASScript.xmla script.
3. Verify that the EIM_Demo_DW_Load package has been added to the MIA-SQL Integration Services
instance.
Manually deployed a DACPAC that has been part of a Team Foundation Server Build.
Question: What conclusions can be drawn from the various methods of deployment that are
available?
Module 5
Logging and Monitoring in BI Operations
Contents:
Module Overview 5-1
Lesson 1: The Need for Logging and Monitoring 5-2
Module Overview
The main aim of any operations team is to ensure the continued service of key applications that are used
within the business—more organizations are seeing a BI solution as a critical application for ensuring
success. Therefore, the BI operations team should implement a proactive approach to overseeing the
general health of the servers and services that are being used.
This will involve employing a number of technologies that can log the operations of a service to
proactively manage any potential problems that are identified. There will be times when the BI operations
team will have to be reactive, using monitoring tools to help identify the root cause of any potential
issues.
Objectives
After completing this module, you will be able to:
Describe the need for logging and monitoring.
Lesson 1
The Need for Logging and Monitoring
Logging and monitoring plays an important role in BI operations management. SQL Server® provides a
variety of tools that you can use to retrieve information about the configuration and performance of your
SQL Server components.
Even in environments where rigorous standards and processes are applied to managing the BI
infrastructure, there will be situations where the BI operations team will need to investigate and identify
the root cause of a reported issue.
Using the available logging and monitoring tools, in addition to your understanding of how the
environment is configured, can provide an overall picture of the processes that are affecting your system
and will help you to solve problems efficiently. Some of the tools can also be used to identify areas that
can be improved.
Lesson Objectives
After completing this lesson, you will be able to describe:
The BI operations team should, therefore, categorize the servers that they support, based on their
importance to the business. Mission critical servers that will cause business disruption could be
categorized as Tier 1 servers. You can set service level agreements to determine the amount of time that
Tier 1 servers can be offline, or the amount of time that is allowed for a fix to be put in place. Other
categories can then be defined for other types of servers. This will dictate the level of support, and the
amount of logging and monitoring that will be performed on each tier of servers.
Types of Logging
You can use a variety of tools to log events and
activities across many of the products in the
Windows and SQL Server stack. After a logging
option is set up, it will typically run constantly in
the background while the service is operational.
The intention is to collect information that logs
the activity that has been occurring against the
service; to log any errors or exceptions; or to log
the queries that are being executed against the
service. The logging options include:
Windows logging. Windows Server®
provides event logs that enable you to review
a history of information regarding the system, its applications, and its security. You can use the event
log to see if the cause of an issue is related to Windows or to a specific application. Event logs begin
automatically when a Windows Server is started.
SQL Server logging. SQL Server contains an error log that records the error so that you can
troubleshoot problems that are specific to SQL Server. The log file starts automatically at the same
time as the SQL Server instance. You can also configure how many log files are retained. In addition,
the SQL Server Agent that is typically used to automate the execution of BI tasks keeps a log of the
execution history.
Integration Services logging. Integration Services provides a variety of options that enable you to
log standard events, or create custom logging events to monitor the progress of the execution of a
package. The types of logging available include:
Analysis Services logging. Analysis Services provides extensive logging options that can record
many aspects of a server’s activities, including query performance and processing performance. The
logging options available to SSAS include:
o Query logging
o Error logging
o Exception logging
o Traces
o Flight recorder
Reporting Services logging. Reporting Services provides logging capability that means you can view
any setup issues, and track the execution of reports through user or scheduled activities. The available
options for logging include:
Data Quality Services logging. Data Quality Services provides three types of log files for
troubleshooting any issues that may arise with the Data Quality Server, Data Quality Client, and the
DQS cleansing component found in SQL Server Integration Services.
Master Data Services logging. The Master Data Services Web.config file contains a tracing section
that means you can capture the activity that is occurring on Master Data Services and store it in a log
file.
Logging is useful for providing information about the general health of the SQL Server. There are so many
options for logging the activity of SQL Server BI components that it is important for the BI operations team
to be selective. Certain logging options, such as event logs and SQL Server error logs, are mandatory and
cannot be turned off. Other types of logging have to be configured before they can be used. A decision
must be made on which logging should be set up to provide the team with valuable information to help
them debug potential problems. Another consideration is that logging can consume additional resources.
Therefore, it is important to ensure that the logging setup does not have an excessive impact on server
resources.
Types of Monitoring
Monitoring tools are typically executed in
response to information that is found through
logging, when the logging solution does not provide
enough detail to solve the issue. You
can use monitoring to capture more granular data
about a particular area of Windows or a SQL
Server. Monitoring will consume resources that
can have an impact on the operation of a server.
As a result, monitoring is performed as a reactive
measure to dig deeper into an issue. However, you
can run many of the monitoring tools proactively
to capture more information, if the server on
which it runs can handle the load of the monitoring tool. The following monitoring tools are available:
Windows monitoring. Windows Server provides Windows Performance Monitor to deliver real-time
information about hardware, services, and various SQL Server components, including SSIS, SSAS, and
SSRS that are installed on the server. Windows monitoring can also be set up, so that you can capture
information during a defined, targeted time period.
SQL Server monitoring. SQL Server provides a variety of monitoring tools, such as Extended Events
and SQL Server Profiler, that can be used by different SQL Server components. In addition, you can
use the Transact-SQL language to query information that is stored about the SQL Server activity by
using dynamic management views (DMVs). Activity Monitor can also provide real-time information
about the connections that are running against an instance of a SQL Server. You can also use
Execution Plans to monitor the performance of Transact-SQL queries that are running against the
database.
Integration Services monitoring. Integration Services installs a number of objects within Windows
Performance Monitor to provide real-time or collected performance data. You can also query the SSIS
Catalog to monitor the real-time execution of packages on an SSIS server.
Analysis Services monitoring. Analysis Services also installs a number of objects within Windows
Performance Monitor. In addition, Extended Events can be used to monitor the performance of cubes.
You can also use SQL Server Profiler to monitor an Analysis Services instance.
Reporting Services monitoring. Reporting Services installs a number of objects within Windows
Performance Monitor to monitor the activity of the Report Server Web Service, and the Report Server
Windows Service.
Data Quality Services monitoring. Data Quality Services provides an activity monitor in the Data
Quality Services client to display information about the activity occurring on the client.
Master Data Services monitoring. Trace logging is used for both monitoring and logging.
Importance of Baselining
A proactive approach to a logging and monitoring
strategy is to collect enough information to
understand the operational behavior of your SQL
Servers and the resources that are being
consumed during different periods of the business
life cycle. This is known as establishing a baseline.
Understanding the baseline operational
performance of a SQL Server will help you make
informed decisions about particular activities that
occur.
You will need to implement the logging and monitoring options over the periods of time that best reflect
the business process, and collect the information from the tools to establish clear baselines. For example,
the data may inform you that it is normal for the SQL Server memory to peak at 90 percent consumption
overnight, due to the activity of loading a data warehouse. However, the same memory consumption
occurring outside of data warehouse load periods might be deemed unacceptable.
Question: What tools do you use to baseline the performance of your BI technologies?
Lesson 2
Logging Options
SQL Server provides many logging techniques to help you keep track of a wide range of areas that can
impact security, performance, and operations. You can also configure the tools to focus on specific events,
such as warnings and errors, or failed and successful events.
Different logging tools offer different benefits—some provide broad information about a technology,
whilst others focus on a specific aspect of a technology. Some of these tools provide unique benefits but
there are also limitations that the BI operations team should be aware of. An informed decision can then
be made about which tools to employ when logging the operations of a SQL Server BI estate.
Lesson Objectives
After completing this lesson, you will understand the options for:
Windows logging.
SQL Server logging.
Windows Logging
Windows event logs can provide useful
information regarding both the general health of
the SQL Server components and the Windows
operating system. It will log informational
messages, in addition to errors and warnings.
There is also a log to record security events that
occur on the system. The following three core log
areas would be of interest to the BI operations
team:
Application logs. SQL Server will log informational, warning, and error events in this log. Each event
that is logged contains a severity level—a severity level of 19 or above is typically logged with error
events and should be taken seriously by the BI operations team.
Security logs. Security logs keep a record, or an audit, of security related events. These are typically
recorded as successful or failed events. The security log will not audit every single event. Usually,
additional configuration is required within Windows or SQL Server to make use of security logging.
When a SQL Server component is installed on a server, it will record general events to the application log.
The BI operations team can use the Event Viewer application to view, filter and manage the events that
are stored within the event logs. To access the Event Viewer, click the Windows Start button, type Event
Viewer, and then click the Event Viewer application.
In the Event Viewer application, the logs described above will appear in the Windows Logs node. You can
right-click a log and go to its properties to configure settings, such as the size and log storage location for
the event log, in addition to the retention period for logged data. The log size and log retention setting is
an important consideration for the BI operations team. You need to configure these settings so that
enough retention of the log data is available should you wish to troubleshoot a situation. There could be
a scenario where a server is so active that the logged data could fill up very quickly, causing old logged
data to be removed.
You can also interact with the Event Viewer Log by right-clicking a log and choosing one of the following
options:
Filtering can help the team focus on the specific items that exist within a log—for example, you could
filter by an application, an event type, or a date range. Being able to save a log means you can clear the
log, should you wish to record events from a particular starting point.
When an issue is first reported, the BI operations team should use the event logs found in Windows to see
the general health of both the Windows system and SQL Server. Serious issues for SQL Server will be
recorded in the application log, so looking there can provide the first clues as to where an underlying
problem is occurring.
To view the SQL Server error logs in SQL Server Management Studio, you should perform the following
steps:
1. Open SQL Server Management Studio, and connect to the database engine instance.
2. In Object Explorer, expand the instance, expand Management, and then expand SQL Server Logs.
4. To view a log, right-click a log file, and then click View SQL Server Log.
After a SQL Server log is opened, you can interact with the log file in a similar way to Windows event logs
by filtering the log data, exporting the data, and searching through the log data. If the team prefers, you
can work directly with the log data that, by default, is stored in the following location:
Seven files appear, the first named ERRORLOG, and subsequent files named ERRORLOG.n, where n is
equal to the number of the log file that is displayed in SQL Server Management Studio. Being able to view
the log files in a text editor is useful in an event where you cannot access SQL Server.
Like event logs, the BI operations team should use SQL Server error logs to provide an overview of the
health of SQL Server and its components. The BI operations team can configure the number of error log
files that are retained by the SQL Server by right-clicking SQL Server Logs in Management Studio and
clicking Configure. This means that the team can ensure that enough logging information is retained,
because high volume environments will cycle through the logs very quickly, and in extreme cases, may
only hold error information for the previous few hours. It might be necessary to configure this option so
that information can be retained for troubleshooting a SQL Server.
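As a related illustration, the current error log can also be cycled on demand from T-SQL, which closes the current file and starts a new one without restarting the instance:

USE master;
GO
-- Close the current SQL Server error log and start a new one
EXEC sp_cycle_errorlog;
GO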
SQL Server Setup Logs
Log files are also created during setup. If setup fails or succeeds but shows warnings or other messages,
you can examine the log files to troubleshoot the problem. This applies to all SQL Server components that
are set up. Each execution of setup creates log files with a new time-stamped log folder at C:\Program
Files\Microsoft SQL Server\140\Setup Bootstrap\Log. The time-stamped log folder name format is
YYYYMMDD_hhmmss. These files can be opened in a text editor. It is typical to search for the word
“error” to find the locations within the file that contain any error messages.
5. Under Logging level, select the logging level, and then click OK to execute.
Basic. The default value that logs all information except for diagnostic information.
RuntimeLineage. Collects information that tracks the lineage of the data as it runs through the
different tasks in a package execution.
Performance. Only performance statistics about the package, together with warning and error events, are
collected.
Verbose. All events, including diagnostic events about the package execution, are collected.
SQL Server also enables you to create customized logging levels that collect only the statistics and events
that you want. When you run a package, you can select a customized logging level wherever you can
select a built-in logging level.
To create a custom logging level, you should perform the following steps:
1. In SQL Server Management Studio, right-click the SSISDB database and select Customized Logging
Level.
2. To create a new customized logging level, click Create, and then provide a name and description.
3. On the Statistics and Events tabs, select the statistics and events that you want to collect.
4. On the Events tab, optionally select Include Context for individual events, then click Save.
The new execution logging level can be selected the next time a package is executed.
The information that is collected from the package execution is stored in tables within the SSISDB Catalog.
Views that are stored in the database can be used to return information about the execution of packages,
event statistics, and the project settings and environments that are used. This means you can build queries
that return information about the execution and performance of a package.
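For example, a query along the following lines (a minimal sketch using the catalog.executions view) returns recent package executions together with their status and duration:

-- Recent SSIS package executions from the SSISDB catalog.
-- Common status values: 4 = failed, 7 = succeeded.
SELECT TOP (50)
       e.execution_id,
       e.folder_name,
       e.project_name,
       e.package_name,
       e.status,
       e.start_time,
       e.end_time,
       DATEDIFF(SECOND, e.start_time, e.end_time) AS duration_seconds
FROM   SSISDB.catalog.executions AS e
ORDER BY e.start_time DESC;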
In SQL Server, the ssis_logreader database role has the ability to read the log data in the catalog. In
previous versions of SQL Server, this could only be done by the ssis_admin role. However, the permissions
associated with that role are considered too great for users who only need to read the log data in
the SSIS Catalog.
If package execution is started from the SQL Server Agent, you can use the information from the SQL
Server Agent history, along with the SSISDB views, to provide an evidence-based picture of how the
packages are performing.
o Query Log. The query log records information about the queries used against the Analysis
Server. The query log does not capture the full MDX query that is sent to the server. Instead, it
captures a numeric list of the hierarchies and attributes used in each dimension, such as
01,00000010200000,100,00,100,12000. Each comma separates the level numbers between
dimensions. The server can use this list to find out which hierarchies were accessed and at what
level, so it can optimize its aggregates without having the details of the query. This information
can be stored in a SQL Server table and/or a file.
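If the query log is directed to a SQL Server table (by default named OlapQueryLog, as described later in this course), it can be inspected with a simple query. In this sketch, ASQueryLogDB is a placeholder for whichever database is defined in the query log connection string:

-- Most recent entries in the Analysis Services query log table.
-- ASQueryLogDB is a placeholder; use the database defined in QueryLogConnectionString.
SELECT TOP (100)
       MSOLAP_Database,
       MSOLAP_ObjectPath,
       MSOLAP_User,
       Dataset,      -- the encoded list of hierarchies and attributes described above
       StartTime,
       Duration
FROM   ASQueryLogDB.dbo.OlapQueryLog
ORDER BY StartTime DESC;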
Additional Analysis Services logging capabilities include traces, exception logging, and the flight recorder.
However, it is recommended that these methods of logging should only be used after taking advice from
Microsoft.
Logging is turned off by default, but can be enabled by adding the relevant configuration entries to the
ReportingServicesService.exe.config file.
The configuration file is stored in the C:\Program Files\Microsoft SQL Server Reporting
Services\SSRS\ReportServer\bin folder and can be opened in a text editor.
Execution Logging
Information is recorded in the ReportServer database when a report is accessed and executed. Reporting
Services provides three views in the ReportServer database (ExecutionLog, ExecutionLog2, and
ExecutionLog3) through which you can see the times taken to retrieve, process, and render a report. They
also provide information about the user who accessed the report and when the report was accessed.
The execution log views can provide insightful information regarding the performance of reports,
and in particular, which aspect of the report retrieval is the slowest. The BI operations team can use these
views to identify problem reports through simple querying—the team can also identify which reports are
not being accessed.
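For example, a query similar to the following sketch, which uses the ExecutionLog3 view in the ReportServer database, highlights the slowest reports and shows where the time is being spent:

-- Slowest report executions, broken down by retrieval, processing, and rendering time.
SELECT TOP (20)
       ItemPath,
       UserName,
       TimeStart,
       TimeDataRetrieval,   -- milliseconds spent retrieving data from the data source
       TimeProcessing,      -- milliseconds spent processing the report
       TimeRendering,       -- milliseconds spent rendering the output
       Status
FROM   ReportServer.dbo.ExecutionLog3
ORDER BY (TimeDataRetrieval + TimeProcessing + TimeRendering) DESC;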
The Reporting Services report server trace log file provides verbose information about the Report Server
web service, Report Manager, and the Reporting Services Windows service.
The log file must be enabled by adding the appropriate section above the RSTrace section in the
ReportingServicesService.exe.config file.
A number of values can be used in the value property to determine the level of logging that is used,
including:
0: Disables tracing.
1: Logs exceptions and restarts.
2: Logs exceptions, restarts, and warnings.
3: Logs exceptions, restarts, warnings, and status messages. This is the default.
4: Verbose mode.
Additional settings can be configured in the web.config file, including filename and path, which
determines the location and filename of the trace file. After configuration, you can read the file in a text
editor.
Data Quality Services includes log files for the Data Quality Server, the Data Quality Client, and the DQS
Cleansing component that is used within SSIS.
The Data Quality Server log file is named DQServerLog.DQS_MAIN.log and, by default, is located in
C:\Program Files\Microsoft SQL Server\MSSQL14.MSSQLSERVER\MSSQL\Log. The Data Quality Client log
file is named DQClientLog.log and, by default, is located in %APPDATA%\SSDQS\Log. The Data Quality
Client log file contains similar information as the server log file, but from a client-side perspective. The
DQS Cleansing component log file is named DQSSSISLog.log and, by default, is located in
%APPDATA%\SSDQS\Log. All files can be opened in a text editor.
All of the log file sizes can be configured in the Advanced settings of Data Quality Services Client. They are
rolling files, with a new log file created when the existing log file exceeds the specified size limit.
Lesson 3
Monitoring Options
By adopting a correctly executed monitoring strategy, you can further investigate any issues that are
impacting a BI solution but have only been partially identified or resolved by the logging approach. With
monitoring, you can narrow the search to a specific problem area by using one of the many tools that are
provided by SQL Server and Windows.
Monitoring is typically used on a case-by-case basis to assist further in the identification of an issue. This
means you can troubleshoot an issue in more depth, with a view to better understanding the root cause of a
problem. By understanding the capabilities of the tools that are available, you can pick the right one for
the job.
Lesson Objectives
After completing this lesson, you will be able to monitor:
The operating system.
Windows Performance Monitor. You can use Windows Performance Monitor for a broad
monitoring solution that encompasses Microsoft Windows, SQL Server and the hardware that hosts
these components. Windows Performance Monitor can be used to provide real-time information
about hardware, services, and components on a physical server. It consists of objects that describe a
component or area of the operating system, such as the CPU or the memory object. When an
instance of SQL Server is installed, the installation will add objects to Windows Performance Monitor,
such as SQL Server: Databases and SQL Server: Transactions. Each object contains counters that will
measure a specific part of the object. For example, in the Memory object, the Available MBytes
counter can be used to monitor the amount of free physical memory. In the SQL Server:
Databases object, the Percent Log Used counter can indicate how full a transaction log file is. Some
counters can also be broken down into instances; for example, for the Percent Log Used counter, you can
select an instance representing the database that you wish to monitor, such as the AdventureWorks database.
You can also create a data collector set that will store the real-time performance data from Windows
Performance Monitor in a file for later review.
Activity Monitor
Activity Monitor provides real-time information
about the connections and processes against a
SQL Server. It can also provide information on the
most recent expensive queries. The data cannot be persisted and should only be used to confirm real-time
information about connections. You can also use Activity Monitor to kill connections against the instance.
To access Activity Monitor, right-click the instance in SQL Server Management Studio, and then click
Activity Monitor.
SQL Server Profiler is a graphical tool you use to monitor many aspects of SQL Server, including
transactions and stored procedures. You can use SQL Server Profiler to monitor these components in real
time—this is known as a trace. However, you typically store the results of a profiler trace in a file or table
for later review.
When creating a profiler trace, you must first set up the general properties of the trace, specifying a trace
name and whether or not to save the trace information to a trace file, or to a SQL table. Predefined
templates are available that contain some predefined events to monitor. To view all the available events,
select the blank template. A trace stop time can also be defined in the general properties.
Should you choose a blank template, you can use the Events tab to define the events that you want to
record. These events are organized into categories that represent a component of SQL Server. The trace
file can be stopped manually at any time. After the trace has been created, it can be replayed and
reviewed in SQL Server Profiler.
Typically, SQL Server Profiler is used with the database engine to monitor the queries that are executing
against the instance of SQL Server, for example by profiling events such as SQL:BatchCompleted and RPC:Completed.
You can use the results of a trace file with other SQL Server tools, such as the Database Engine Tuning
Advisor. You can also integrate system monitor files within Profiler to correlate information between the
two tools if they have been running at the same time.
Transact-SQL Queries
You can use a range of Transact-SQL queries against system tables or DMVs to return useful information
to a BI operations team.
DMVs can be used to monitor the health and performance of a SQL Server instance. When SQL Server is started,
information regarding its operations is accumulated in internal system tables. The BI operations team can query this
information by using DMVs, which appear under the Views node in SQL Server Management
Studio; each DMV name begins with sys.dm_. If the SQL Server instance has been running for a long period
of time, the information that is captured in DMVs can be extremely valuable when you are trying to
establish the overall health of the server. When a SQL Server instance is restarted, the contents are purged.
The first question that should be asked of SQL Server, before even querying the other DMVs, is: how long has the instance been running?
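A minimal sketch of such a query, using the sys.dm_os_sys_info DMV, is shown below:

-- How long has this instance of SQL Server been running?
SELECT sqlserver_start_time,
       DATEDIFF(DAY, sqlserver_start_time, SYSDATETIME()) AS days_online
FROM   sys.dm_os_sys_info;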
The result of the query will return the number of days the server has been online, and it will tell the BI
operations team just how useful the information will be. The longer the uptime of the server, the more
useful the information is, because the server will have been through a range of business cycles, including
month-end and quarter-end periods.
Even if the server uptime is low, you can use a system query to determine the number of
tempdb data files and CPU cores that are configured for the instance, by using both system catalog views and
DMVs.
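A sketch of such a query is shown below; it combines the tempdb.sys.database_files catalog view with the sys.dm_os_sys_info DMV:

-- Number of tempdb data files and the number of logical CPUs visible to the instance.
SELECT (SELECT COUNT(*)
        FROM tempdb.sys.database_files
        WHERE type_desc = 'ROWS') AS tempdb_data_files,
       (SELECT cpu_count
        FROM sys.dm_os_sys_info)  AS logical_cpu_count;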
Using Transact-SQL queries to answer the questions that the BI operations team will be asking when there
is an issue provides the data-driven approach required to troubleshoot problems with a BI solution.
To view the history of a job, you should perform the following steps:
1. Open SQL Server Management Studio, and connect to the database engine instance.
2. In Object Explorer, expand the SQL Server Agent node, and then expand Jobs.
3. Right-click the job whose history you want to view, and then click View History.
It is not uncommon for SQL Server integration tasks to be scheduled using the SQL Server Agent. In
addition, Reporting Services uses the SQL Server Agent to execute scheduled snapshots, caches and
subscriptions. Therefore, understanding how to view the history of a SQL Server Agent job is important
when you are solving execution issues that are reported to the BI operations team.
The following Performance Monitor counters are useful to the BI operations team. Each entry shows the counter path (object\counter), the instance to monitor in parentheses, and what the counter indicates.
System\Processor queue length (N/A). Indicates how many threads are waiting for execution against the processor. If this counter is consistently higher than around 5 when processor utilization approaches 100 percent, this is a good indication that there is more work (active threads) available (ready for execution) than the machine's processors are able to handle.
System\Context switches/sec (N/A). Measures how frequently the processor has to switch from user to kernel mode to handle a request from a thread running in user mode. The heavier the workload running on your machine, the higher this counter will generally be but, in the long term, the value of this counter should remain fairly constant. However, if this counter suddenly starts increasing, it may be an indication of a malfunctioning device, especially if the Processor\Interrupts/sec\(_Total) counter on your machine shows a similar, unexplained increase.
Processor\% processor time (_Total and individual cores). Measures the total utilization of your processor by all running processes. If you have a multiprocessor system, you should know that only an average is provided.
Processor\% privileged time (_Total). Used to see how the OS is handling basic IO requests. If kernel mode utilization is high, it is likely that your machine is underpowered—it is too busy handling basic OS housekeeping functions to effectively run other applications.
Processor\Interrupts/sec (_Total). The average rate, in incidents per second, at which the processor received and serviced hardware interrupts. Should be consistent over time but a sudden unexplained increase could indicate a device malfunction that can be confirmed using the System\Context Switches/sec counter.
Memory\Pages/sec (N/A). Indicates the rate at which pages are read from or written to disk to resolve hard page faults. This counter is a primary indicator of the kinds of faults that cause system-wide delays, and is the primary counter to watch for any indication of possible insufficient RAM to meet your server's needs. A good idea here is to configure a perfmon alert that triggers when the number of pages per second exceeds 50 per paging disk on your system. You may also want to see the configuration of the page file on the server.
Physical Disk\Disk transfers/sec (for each physical disk). If it goes above 10 disk I/Os per second, you have a poor response time for your disk.
Physical Disk\Idle time (for each physical disk). If the number of disk transfers per second is above 25 disk I/Os per second, you should use this counter. It measures the percentage of time that your hard disk is idle during the measurement interval—if you see this counter fall below 20 percent, it is likely that you have read/write requests queuing up for your disk, which is unable to service these requests in a timely fashion.
Physical Disk\Disk queue length (for SQL Server and Analysis Services disks). A value that is consistently less than 2 means that the disk system is handling the I/O requests against the physical disk.
Network\Current bandwidth (for each network card). This is an estimate of the current bandwidth of the network interface in bits per second (bps).
MSAS 2016:Memory\Memory limit high KB (N/A). Shows (as a percentage) the high memory limit configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\Config\msmdsrv.ini.
MSAS 2016:Memory\Memory limit low KB (N/A). Shows (as a percentage) the low memory limit configured for SSAS in C:\Program Files\Microsoft SQL Server\MSAS13.MSSQLSERVER\OLAP\Config\msmdsrv.ini.
MSAS 2016:Memory\Memory usage KB (N/A). Displays the memory usage of the server process.
MSAS 2016:Memory\File store KB (N/A). Displays the amount of memory that is reserved for the cache. Note that, if the total memory limit in the msmdsrv.ini is set to 0, no memory is reserved for the cache.
MSAS 2016:Storage Engine Query\Queries from cache direct/sec (N/A). Displays the rate of queries answered directly from the cache.
MSAS 2016:Storage Engine Query\Queries from cache filtered/sec (N/A). Displays the rate of queries answered by filtering an existing cache entry.
MSAS 2016:Storage Engine Query\Queries from file/sec (N/A). Displays the rate of queries answered from files.
MSAS 2016:Connection\Current connections (N/A). Displays the number of connections against the SSAS instance.
MSAS 2016:Connection\Requests/sec (N/A). Displays the rate of query requests per second.
MSAS 2016:Locks\Current lock waits (N/A). Displays the number of connections waiting on a lock.
MSAS 2016:Threads\Query pool job queue length (N/A). The number of queries in the job queue.
MSAS 2016:Proc Aggregations\Temp file bytes written/sec (N/A). Shows the number of bytes of data processed in a temporary file.
SQL Server:SSIS pipeline\Buffer memory (N/A). The amount of memory that is used by data flow tasks that are executing at the time of monitoring.
SQL Server:SSIS pipeline\Buffers spooled (N/A). The amount of memory that is written to disk.
SQL Server:SSIS pipeline\Rows read (N/A). The total number of rows that are used as inputs into SSIS data flows.
SQL Server:SSIS pipeline\Rows written (N/A). The total number of rows that are outputs of the SSIS data flows.
SQL Server:SSIS service\SSIS package instances (N/A). The total number of SSIS packages executing.
Report Server Service\Memory pressure state (N/A). Reports the current pressure that the memory is under, with the following values returned: 1 = no pressure; 2 = low pressure; 3 = medium pressure; 4 = high pressure; 5 = exceeded pressure.
SQL Server:Buffer Manager\Buffer cache hit ratio (instance). The percentage of data that was read from memory, rather than the disk.
SQL Server:Buffer Manager\Page life expectancy (instance). The average length of time that data will remain in memory before being ejected. A low value can indicate excessive memory pressure.
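Many of the SQL Server counters listed above can also be read from Transact-SQL through the sys.dm_os_performance_counters DMV, which provides a quick cross-check when the Performance Monitor console is not available. A minimal sketch:

-- Read buffer manager counters for the current instance.
-- Note: ratio counters such as Buffer cache hit ratio must be read together with their
-- corresponding "base" counter to give a meaningful percentage.
SELECT object_name, counter_name, cntr_value
FROM   sys.dm_os_performance_counters
WHERE  counter_name IN (N'Buffer cache hit ratio', N'Page life expectancy');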
To add a counter in Performance Monitor, you should perform the following steps:
1. In the Performance Monitor console, in the left pane, ensure that Performance Monitor is selected under Monitoring Tools.
2. On the toolbar, click the Add button (the green plus sign).
3. In the Add Counters dialog box, in the Select counters from computer list, ensure that <Local computer> is selected.
4. Scroll through the counters list, expand the Memory node, click the Available MBytes counter, click Add, and then click OK.
Profiler
Profiler can be used to monitor either the database engine or Analysis Services. It can be used to perform
real-time monitoring of the component in question or, alternatively, you can save the real-time
monitoring to a trace file for later review. Running SQL Server Profiler traces is extremely resource intensive and
should only be done when the server in question can handle the load. However, the data that is returned
could help you to pinpoint the cause of an issue on a server, if the correct activities to monitor are chosen.
To run SQL Server Profiler, click Start on the Windows desktop and type SQL Server Profiler.
When you are profiling Analysis Services, the monitoring of the following events can help you to
troubleshoot query and processing performance:
Server\Errors. Shows when error messages are returned by the Analysis Server.
Query Processing\Query cube begin. Shows the individual requests for data that can be used within the Usage Based Optimization.
Query Processing\Get data from aggregation. Shows the queries that are returning results from the aggregations stored in the cubes.
Query Processing\Query subcube verbose. Shows the individual requests for data in a more readable format.
Query Processing\DAX query plan. Shows the query plan information for queries in tabular data models.
Query Processing\Direct query begin. Shows when direct query mode is being used in tabular data models.
Query Processing\Vertipaq SE query cache match. Shows the individual requests for data against a tabular data model.
1. On the Windows desktop, click Start, type Profiler, and then click SQL Server Profiler.
2. On the menu bar, click File, point to Templates, and then click New Template.
3. In the Trace Template Properties dialog box, next to server type, click the drop-down, and select
SQL Server 2016 Analysis Services.
4. Next to name, type a name for the template.
5. Click the Events Selection tab, and configure the events to include in the template.
SSIS Catalog
From Microsoft CodePlex, you can download SSIS reporting packs that contain prebuilt reports that query
the SSIS Catalog to provide operational information about package execution. The SSIS reporting pack
can be found at https://ssisreportingpack.codeplex.com/.
Reporting Services
Reporting Services provides execution logs that are covered in the next module.
Demonstration Steps
1. Start Microsoft SQL Server Management Studio.
2. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.
3. To open the Activity Monitor, in Object Explorer, right-click MIA-SQL, and then click Activity
Monitor.
4. To open the Processes section, click Processes.
5. To open the Resource Waits section, click Resource Waits.
6. To open the Data File I/O section, click Data File I/O.
7. To open the Recent Expensive Queries section, click Recent Expensive Queries.
8. To change the refresh interval, right-click anywhere in the Overview section, point to Refresh interval,
and then click 1 second.
Demonstration Steps
1. Click Start, type Performance, and then click Performance Monitor.
2. To view the list of data collector sets, in the Performance Monitor window, on the left pane, click
Data Collector Sets.
3. To create a new data collector set, expand the Data Collector Sets node, right-click User Defined,
point to New, and then click Data Collector Set.
4. In the Create New Data Collector Set wizard, on the How would you like to create this new data
collector set? page, in the Name box, type a name such as SQL Monitoring.
5. Select Create manually (Advanced), and then click Next.
6. On the What type of data do you want to include? page, select the Performance counter check
box, and then click Next.
7. On the Which performance counters would you like to log? page, click Add.
8. In the dialog box, in the Available counters section, expand the Processor node, scroll down, click
%Processor Time, and then click Add.
9. Scroll up and expand the PhysicalDisk node, scroll down and click Avg. Disk Queue Length, and
then click Add.
10. Scroll up and expand the Memory node, scroll down and click Available MBytes, and then click
Add.
11. Scroll down and expand the SQLServer:Databases node, click Active Transactions, and then click
Add.
12. In the Add Counters dialog box, click OK.
13. On the Which performance counters would you like to log? page, click Next.
14. On the Where would you like the data to be saved? page, click Next.
15. On the Create the data collector set? page, ensure that Save and close is selected, and then click
Finish.
16. In the Performance Monitor window, on the right pane, right-click SQL Monitoring, and then click
Start.
Demonstration Steps
1. On the taskbar, click the File Explorer shortcut.
2. View the contents of the D:\Demofiles\Mod05 folder.
4. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
5. Click Start, type SQL Server Profiler, and then click SQL Server Profiler 17.
6. In the SQL Server Profiler window, on the File menu, click New Trace.
7. In the Connect to Server dialog box, in the server type, select Database Engine. In the Server name
list, ensure that MIA-SQL is selected, and then click Connect.
8. In the Trace Properties dialog box, in the Trace name box, select the text, and type
QueryMonitoring.trc.
10. In the Trace Properties dialog box, select the Save to file check box.
14. On the Events Selection tab, expand the TSQL node, scroll down, and select the SQL:BatchStarting,
SQL:BatchCompleted, SQL:StmtStarting, and the SQL:StmtCompleted check boxes.
16. In the Edit Filter dialog box, in the list, click DatabaseName.
17. Expand the Not like node, type Master, and then click OK.
19. In the Organize Columns dialog box, in the list, click DatabaseName, and then click Up until
DatabaseName appears under the Groups node, and then click OK.
20. To run the trace, in the Trace Properties dialog box, click Run.
23. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.
24. In Object Explorer, expand the Databases node, expand the AdventureWorks node, and then
expand the Tables node.
25. Right-click Person.Address, and then click Select Top 1000 Rows.
26. Right-click HumanResources.Employee, and then click Select Top 1000 Rows.
27. Right-click HumanResources.JobCandidate, and then click Select Top 1000 Rows.
29. Right-click Production.BillOfMaterials, and then click Edit Top 200 Rows.
30. Additional tables may be selected to provide more information to SQL Server Profiler.
32. Return to SQL Server Profiler, expand the AdventureWorks database name, and review traces for the
work performed.
33. In the toolbar, click Stop Selected Trace, and then close SQL Server Profiler.
Question: Are there any barriers to you running monitoring tools on production servers?
Lesson 4
Setting Up Alerts
You can use SQL Server to create and configure alerts. Alerts enable you to define a condition, and when
the condition is met, an action will be performed. This can either involve alerting members of the BI
operations team about errors with a particular component, or invoking a job that could remediate the
error that is occurring. Using alerts is a prudent way to provide a first response to an issue—the
operations team should create alerts that will help them to respond immediately to any issues that arise.
Lesson Objectives
After completing this lesson, you will have created operators and alerts in SQL Server Agent.
3. To create an operator, in Object Explorer, expand the SQL Server Agent node, right-click Operators,
and then click New Operator.
6. Under the Pager on the duty schedule, select the Monday, Tuesday, Wednesday, Thursday, and
Friday check boxes.
Demonstration Steps
1. Start SQL Server Management Studio.
2. In the Connect to Server dialog box, ensure that the Server name is set to MIA-SQL, and then click
Connect.
3. To create an operator, in Object Explorer, expand the SQL Server Agent node, right-click Operators,
and then click New Operator.
7. Click OK.
8. Close SQL Server Management Studio without saving any changes.
SQL Server alerts can be set up to respond to three types of scenarios, but typically two scenarios are
used. These can include:
SQL Server events. For SQL Server events, alerts can be set up to alert an operator or start a job in
response to SQL Server error numbers, or error severity levels.
SQL Server performance conditions. For performance conditions, alerts can respond when a performance
counter, such as Percent Log Used for a database, crosses a defined threshold.
SQL Server Agent alerts provide a useful feature that enables the BI operations team to be proactive in the
management of the BI estate. You should consider which servers to set up alerts for, based on the tier of
server they are assigned from an operational point of view.
Demonstration Steps
1. Start SQL Server Management Studio.
2. In the Connect to Server dialog box, ensure that the server name is set to MIA-SQL, and then click
Connect.
3. To create a SQL Server alert, in Object Explorer, expand the SQL Server Agent node, right-click
Alerts, and then click New Alert.
4. In the New Alert window, in the Name box, type Log File size for EIM_Demo.
10. To define a response, in the Select a page pane, click Response, and then select the Notify
operators check box.
11. In the Operator list, select the E-mail check box.
12. To execute a job, select the Execute job check box, and then click New Job.
13. In the New Job window, in the Name box, type Backup EIM Log, and then in the Select a page pane,
click Steps, and then click New.
14. In the New Job Step window, in the Command box, type a Transact-SQL command that backs up the transaction log of the EIM_Demo database (a sketch of a suitable command appears after these steps).
18. In the first E-mail list, select BIOperators; in the second E-mail list, ensure that When the job fails is
selected.
19. On the Notifications page, select the Write to the Windows Application event log check box; in
the list, select When the job fails, and then click OK.
20. In the New Alert window, in the Select a page pane, click Options, select the E-mail check box, and
in the Additional notification message to send box, type generated by an alert from the SQL
Server, and then click OK.
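A sketch of a suitable job step command is shown below; the backup path is a placeholder for illustration only:

-- Back up the transaction log of the EIM_Demo database.
-- D:\Backups\ is a placeholder path; use a location that exists on your server.
BACKUP LOG EIM_Demo
TO DISK = N'D:\Backups\EIM_Demo_Log.trn'
WITH INIT;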
Adventure Works employees are increasingly frustrated by the time it takes for business reports to
become available on a daily basis. The existing managed BI infrastructure—including data warehouses,
enterprise data models, and reports and dashboards—is a valued source of decision-making information.
However, users are increasingly finding it takes too long for the data to be processed in the overnight
load, resulting in reports not arriving to business users until the early afternoon.
You have been asked to support the BI operations team in devising a logging and monitoring solution
that will help Adventure Works understand what could be causing the issues that they are experiencing.
To that end, you will set up the logging and monitoring process in preparation for execution when
troubleshooting the issues in the environment.
Objectives
At the end of this lab, you will have set up:
General logging and monitoring.
Password: Pa55w.rd
This will involve configuring the Windows event logs to store up to 50 MB of information and the SQL
Server error logs to capture information over 14 log files. You will need to configure a data collector that
captures information about all of the components, and a custom SSIS log that can be used to log and
monitor errors and warnings regarding the overnight SSIS load. You will also create a SQL Server Agent
job that captures the execution history of the EIM_Demo_DW_Load package when it executes.
Counters to include (object\counter): Memory\Pages/sec.
A data collector.
An SSIS custom log.
Task 1: Creating a SQL Server Profiler Template for the Database Engine
Create a SQL Server Profiler template that contains events to monitor queries that occur against the
SQL Server instance. Accept the default columns when selecting the events.
Question: Would you have added any additional monitoring tools to the approach laid out
in this module?
Question: Which graphical tool would you use to identify locking and blocking that
currently exists on the server, with a view to terminating a process that is causing the
blocking?
Module 6
Troubleshooting BI Solutions
Contents:
Module Overview
Lesson 1: Troubleshooting Failed BI Solutions
Module Overview
The task of trying to troubleshoot failed BI solutions can be complex. It requires an understanding of the
environments in which the BI solution is hosted, and an understanding of the workloads that take place
during the life cycle of the solution. Troubleshooting can be made easier if the BI operations team has
established defined standards for different tiers of servers for the configuration, security, and deployment
of the solution. Standards create a baseline environment for the servers and the solution so that the BI
operations team have a clear understanding of the environment that they are troubleshooting.
With this in place, when an issue is reported to the operations team, they can adopt a structured
troubleshooting approach that means they can resolve the issue, and understand the root cause—this
leads to a long-term fix. As these issues are occurring within live environments, it is prudent to follow a
process that is in line with operational procedures, so that you set the expectations for resolving an issue.
This will typically involve applying a fix that follows either standard operating procedures or emergency
operating procedures.
Objectives
At the end of this module, you will know the correct approach for troubleshooting:
Failed BI solutions
Data Warehouse
Analysis Services
Reporting Services
Lesson 1
Troubleshooting Failed BI Solutions
Many aspects of a BI solution can fail—including a data warehouse load not completing, the Analysis
Services processing taking too long, or reports showing data that is out of date. Even if the loading of a BI
solution succeeds, the BI operations team may get service desk requests from users with problems,
including the inability to access an individual report, or reports taking too long to generate. A structured
troubleshooting approach means an operations team can:
Understand the symptoms of the problem.
Lesson Objectives
After completing this lesson, you will understand:
Troubleshooting Approach
A good troubleshooting approach should have in
place a broad logging and monitoring method
that will record information about the general
health of Windows®, SQL Server® and its related
components. Making use of the tools that provide
this information, such as Windows event logs, and
SQL Server error logs, will be the starting point for
identifying the areas that are having problems. In
certain circumstances, the answer to the problem
may lie in these log files—such as a hard disk
going offline, or a data warehouse database
running out of disk space. Otherwise, more
investigation may be required that will involve using other tools, and talking to the user who first raised
the issue.
The area of SQL Server that is experiencing problems will determine the type of tool that you use to
perform more targeted investigations. For many SQL Server components, you can use Windows Reliability
and Performance Monitor, using counters specific to the SQL Server component to perform further
analysis. SQL Server Profiler could be used to capture events that relate to the database engine or Analysis
Services. Custom logging in Integration Services may help in providing custom information about SQL
Server Integration Services (SSIS) data loads. Using Transact-SQL to query the report execution logs may
provide answers to slow running reports in SQL Server Reporting Services (SSRS). The aim of this targeted
analysis is to find evidence as to the root cause of a reported issue, and then discuss with the BI
operations team a proposed fix and the impact of applying that fix to an environment.
This type of checking can be difficult in fast paced operational environments, where it is important to fix
an issue quickly. In such circumstances, it might be more pragmatic to apply a short-term fix to a
problem, and then perform a retrospective root cause analysis, with a view to applying a long-term fix.
The findings and supporting evidence should be documented, and associated against the service desk
ticket that raised the issue. This means the team can decide whether the issue could be problematic on
other servers within the operational environment—the fix could then be applied as a standard to all the
servers.
Alternatively, the root cause analysis might conclude that a proposed fix may not provide a long-term
solution, and that the scenario will resurface when a particular set of conditions occur on a server. In this
situation, it might be pragmatic to define an operating procedure for dealing with the reoccurrence. This
will provide the team with a signed-off agreement on how to deal with the issue in the future, and reduce
the time it takes to respond to the issue. If the server needs to be taken offline when applying a fix, an
emergency operating procedure would be defined; otherwise, a standard operating procedure would be
defined.
Does the fix require a server to be offline, and does this have a financial impact on the business?
If possible, the BI operations team should try to reproduce the issue in a nonproduction environment, and
perform a dry run through of applying the fix. This will give valuable information as to the outcome of
applying the fix, and further inform the impact analysis. In some organizations, a dedicated support
environment—that mirrors the production environment—may be made available for this purpose.
Check the task manager for the running processes and the current performance. Task manager
can provide a quick snapshot of the current performance and the processes that are running on the
server. You can view the processes tab to quickly assess the memory, CPU, disk, and network
consumption of a given process. You can use this to provide the first clue as to the main area of focus.
Confirm the hardware configuration by running MSInfo32 or viewing System Information. This
can be a particularly important check for virtual servers, because it confirms that the hardware
specifications are performing as expected. There may be a situation where a virtual server has its
configuration changed unknowingly. For example, where the memory previously allocated to the
virtual server has been reduced, this would affect performance.
View the system log in event viewer. The system log in event viewer provides useful information
about the operating system and the hardware. You should filter the system log to display errors and
warnings for the time period since the error occurred.
These cursory checks should not take more than five minutes to perform, but the information provided
will help you to quickly identify where you should focus your troubleshooting efforts, and eliminate the
operating system or hardware as the cause of the problem.
Lesson 2
Troubleshooting the Data Warehouse
The data warehouse provides the foundations for the BI solution. To ensure it operates successfully, it
depends on a number of SQL Server components. The main components are the database engine (to
store the data), Integration Services (to move and transform the data), and the SQL Server Agent (to
initiate the data warehouse execution loads). As a result, troubleshooting will typically start in these areas.
If you adopt a structured approach to troubleshooting, you can narrow down the extent of your
investigations.
Lesson Objectives
At the end of this lesson, you will be able to troubleshoot:
SSIS packages.
The data warehouse.
SQL Server Agent jobs can be configured to trigger a notification when an agent job fails, succeeds, or
completes. You should configure this for key jobs that execute a data warehouse load. Notifications are
sent to an operator using either an email message or a pager. It is typical to send an email message to an
operator to notify them of a failure, so that a team member can be informed immediately.
A view of the Jobs node in the Object Explorer Details window will show you which jobs have failed, as
denoted by a red cross on the SQL Server Agent job. You can view the details of the history in the log file
viewer information—this includes the date, message, log type, and log source of the job. The amount of
history that is held is determined by how the properties of SQL Server Agent are set. In this area, you can
define the file size of the log viewer, and specify a time period when history will be removed from the log.
The information provided by the history will indicate which package and task failed. For custom logging
activities, it is important to set the logging level within the SQL Server Agent job to ensure that the
logging occurs.
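In addition to the log file viewer, the job history can be queried directly from the msdb database. The following is a minimal sketch that returns the most recent failed job steps:

-- Recent failed SQL Server Agent job steps (run_status 0 = failed).
SELECT TOP (50)
       j.name        AS job_name,
       h.step_id,
       h.step_name,
       h.run_date,
       h.run_time,
       h.message
FROM   msdb.dbo.sysjobhistory AS h
JOIN   msdb.dbo.sysjobs       AS j ON h.job_id = j.job_id
WHERE  h.run_status = 0
ORDER BY h.instance_id DESC;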
Look at the default views that are available within the SSISDB. When an SSIS Catalog is deployed
to SQL Server, and SSIS packages are deployed to the SSIS Catalog, a record of the package execution
is held inside the SSISDB. Many views can be used in the SSISDB database to retrieve information
from the SSISDB.
You can retrieve the error messages that are recorded by an SSIS package execution by querying the
catalog.operation_messages and catalog.operations views in the SSIS Catalog. You can also create your own
custom Transact-SQL statements that return the information that you require when troubleshooting
failed packages.
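A sketch of such a statement is shown below; it returns the error messages (message_type 120) recorded for recent catalog operations:

-- Error messages recorded for SSIS catalog operations.
SELECT o.operation_id,
       o.object_name,
       o.start_time,
       om.message_time,
       om.message
FROM   SSISDB.catalog.operation_messages AS om
JOIN   SSISDB.catalog.operations         AS o ON om.operation_id = o.operation_id
WHERE  om.message_type = 120          -- 120 = error, 110 = warning
ORDER BY om.message_time DESC;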
If configured, look through the SSIS custom logs. You can use SSIS to select the level of logging
that is used when a package is executed. This can include the default logging levels, or any custom
log that has been created. You can use the reports that are generated to receive targeted information
regarding any errors that will occur with the packages that are executed.
If using, review the SSIS report packs. If you have downloaded the SSIS report pack from Microsoft
CodePlex, a dashboard is created in Reporting Services that will provide information about executed
packages. You can use this dashboard to identify which packages have failed.
You can also use the following SQL script to confirm the memory setting on a SQL Server instance:
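A minimal sketch of such a script, reading the setting from sys.configurations, is shown below:

-- Confirm the configured and running value of max server memory (in MB).
SELECT name, value, value_in_use
FROM   sys.configurations
WHERE  name = N'max server memory (MB)';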
Check for fragmentation levels of the data warehouse. High fragmentation levels found in a
database are an indication that there is a lack of maintenance. Over time, if maintenance is not
performed on a database, query processing will take more time and other operations will slow down.
You can use the following query to provide information on the level of fragmentation of tables and
indexes within a database:
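The following sketch uses the sys.dm_db_index_physical_stats function, run in the context of the data warehouse database:

-- Fragmentation of all indexes in the current database (LIMITED scan mode).
SELECT OBJECT_NAME(ips.object_id)        AS table_name,
       i.name                            AS index_name,
       ips.index_type_desc,
       ips.avg_fragmentation_in_percent,
       ips.page_count
FROM   sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN   sys.indexes AS i
       ON ips.object_id = i.object_id AND ips.index_id = i.index_id
ORDER BY ips.avg_fragmentation_in_percent DESC;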
Identify any long running queries. You can use SQL Server Profiler to capture the queries that have
been running on the server. You can then analyze the results to identify any long running queries that
have been occurring during the execution of Profiler. As an alternative, and if the server has been
running for a lengthy period of time, you can also use DMVs to provide an average execution time for
queries.
An example may include executing a script against the sys.dm_exec_query_stats and sys.dm_exec_sql_text
DMVs to identify the top 10 long running queries.
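A sketch of such a script is shown below; it ranks cached statements by their average elapsed time:

-- Top 10 cached statements by average elapsed time (microseconds).
SELECT TOP (10)
       qs.total_elapsed_time / qs.execution_count AS avg_elapsed_time,
       qs.execution_count,
       SUBSTRING(st.text,
                 (qs.statement_start_offset / 2) + 1,
                 ((CASE qs.statement_end_offset
                        WHEN -1 THEN DATALENGTH(st.text)
                        ELSE qs.statement_end_offset
                   END - qs.statement_start_offset) / 2) + 1) AS statement_text
FROM   sys.dm_exec_query_stats AS qs
CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
ORDER BY avg_elapsed_time DESC;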
Lesson 3
Troubleshooting SQL Server Analysis Services
From an operational perspective, SQL Server Analysis Services can have issues in one of three main areas.
The cube processing may be slow; the query performance of the cube may be slow; or users may have
issues when trying to access the cube. Before performing in-depth investigations of Analysis Services,
it is important to perform the operating system and data warehouse troubleshooting steps, because
Analysis Services has a dependency on these areas.
Once you are satisfied that these dependent technologies are not affecting Analysis Services, you can then
use a number of tools to provide information to help you resolve the common operational issues.
Lesson Objectives
After completing this lesson, you will be able to troubleshoot:
Cube processing.
Cube query performance.
Cube access.
In addition, you can run the following query to return the suggested missing indexes to create in a
database:
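The following sketch combines the missing index DMVs and orders the suggestions by their estimated impact:

-- Index suggestions recorded by the optimizer since the instance last started.
SELECT mid.statement                    AS table_name,
       mid.equality_columns,
       mid.inequality_columns,
       mid.included_columns,
       migs.user_seeks,
       migs.avg_user_impact             -- estimated % improvement for the queries involved
FROM   sys.dm_db_missing_index_details      AS mid
JOIN   sys.dm_db_missing_index_groups       AS mig  ON mid.index_handle = mig.index_handle
JOIN   sys.dm_db_missing_index_group_stats  AS migs ON mig.index_group_handle = migs.group_handle
ORDER BY migs.avg_user_impact DESC;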
It is important to note that the previously-mentioned query may make multiple index suggestions for the
same table. Additionally, it is important that multiple indexes are not created on the same table to the
detriment of performance, and that the indexes implemented are tested.
Cube processing time-out. There may be occasions where the processing of the cube fails due to a
query time-out and the following error is returned: OLE DB error: OLE DB or ODBC error:
Operation canceled; HY008. This occurs as the result of a time-out expiry associated with the
SQL_QUERY_TIMEOUT setting, meaning the command time-out or query time-out threshold was
reached, and the running query was cancelled. In this scenario, you can modify the server’s advanced
properties—specifically, the ExternalCommandTimeOut property—and increase the value to provide
more time for the processing queries to execute.
QueryLog\QueryLogSampling. This specifies the query log sampling rate. The default value for this
property is 10, meaning that one out of every 10 server queries is logged.
QueryLog\QueryLogConnectionString. This specifies the connection to the query log database.
QueryLog\QueryLogTableName. This specifies the name of the query log table. The default value
for this property is OlapQueryLog.
QueryLog\CreateQueryLogTable. This is a Boolean property that specifies whether to create the
query log table. The default value for this property is false, which indicates that the server will not
automatically create the log table and will not log query events.
You can use the query log to identify the queries that are causing issues with the Analysis Services cubes.
The query log’s benefits are fully realized when used in conjunction with the Usage-Based Optimization
Wizard that is covered in the next module.
It is prudent to first check if the connectivity problem is caused by general network connection issues. You
should ask the user if they can access other applications or network resources first. In addition, you can
run standard network connectivity checks by using tools such as PING or Traceroute. If connectivity is
confirmed by using these tools, then application network connectivity checks can be performed—for
example, this can include connecting to Analysis Services through Excel or SQL Server Management
Studio. If connectivity is still not occurring, you should check if the instance of Analysis Services is a named
instance. You should then check that the SQL Server browser service is running in SQL Server
Configuration Manager, and then ensure that the correct port number is being used to connect to the
service.
To confirm which port number that a named instance of Analysis Services is running under, you should
perform the following steps:
Open task manager and get the Process ID (PID) for msmdsrv.exe.
Open the command prompt and type netstat /abo >>c:\output.txt.
Look for the PID in the output file and the corresponding TCP IP:Port information for the
same PID.
To confirm whether you have the correct port number, open SQL Server Management Studio
and connect to Analysis Services using the IP Address:Port Number (for example,
192.168.1.1:5585).
If the default port 2383 is not used, you can update the port information in the instance properties in
Analysis Services, and then connect from the client machine, using the port to confirm whether connectivity is
resolved.
Check user access and permissions
In this scenario, a user is prompted to authenticate to an Analysis Server but, when providing the
credentials, is unable to connect. The troubleshooting approach to this issue depends on whether the user
is accessing the cube directly, through an application such as Excel, or if they are accessing the service
through another application, such as Reporting Services.
By using a direct connection to Analysis Services, you can check the roles node to see if the user exists
within a role that has been granted access to the data model. If the management of the security is based
on Active Directory® groups, you might need to liaise with the team that manages the groups to confirm
the user’s membership; however, you should first confirm that the group is allowed access to the data
model. Connection to the data model through another application involves more investigation. In this
scenario, a user may connect to a Reporting Server that then connects to the Analysis Server to retrieve
the data—the user is authenticating against the Reporting Server, and then Reporting Services should be
retrieving the data from a data model on behalf of the user. This process is referred to as double hop
authentication. In this circumstance, you would have to liaise with the network team that has probably set
up the double hop authentication using delegation and impersonation in Active Directory, and the setspn
command.
Delegation is the process of giving an Active Directory account permissions to perform a task. An example
is the ability to impersonate another user account. Impersonation is the process of one account
impersonating the credential of another—for impersonation to work, delegation of this permission must
be done first. You can use the setspn command-line tool to manually register an application within Active
Directory, so that it appears as an object that can be managed. With this in mind, the Reporting Services
application can be registered within Active Directory as an object using setspn. You can then set up
delegation for the application in Active Directory.
In this scenario, user access may be revoked as the result of an incorrect setting in either the setspn
command line tool, or the incorrect setting of delegation in Active Directory. After you have confirmed
that the user can access the data model directly through a tool such as Excel, you will need to work with
the Active Directory team.
Lesson 4
Troubleshooting SQL Server Reporting Services
Reporting Services is typically the most visible application that is used by users in a BI solution. The service
desk tickets that are submitted will be varied—they can range from functionality issues, such as report
parameters not working on a report as expected, to reports not rendering in a timely manner. As a result,
the BI operations team should become very familiar with this technology.
Reporting Services may provide symptoms to many of the underlying issues that have already been
discussed in this module. For example, a lack of access to an Analysis Services data model may initially be
described as a Reporting Services issue. The job of the BI operations team is to look beyond the symptoms
to find the root cause.
Lesson Objectives
After completing this lesson, you will be able to troubleshoot:
Reporting problems.
Subscription issues.
Report access.
Report performance
Users may complain about the amount of time it
takes to pull a report when browsing on a Report
Server. The BI operations team can use the
Reporting Services execution log covered in
Module 7 to determine if the issue is related to
poor performance in retrieving the data from a
data source, or whether the time is mainly spent processing the data on the Report Server—or in the
rendering of the report.
Should the issue relate to the time it takes to retrieve the data from the data source, SQL Server Profiler or
DMVs can be used to establish the performance of the query in relation to the additional workload that
will be occurring against the data source.
With the processing of data on the Report Server, Windows Reliability and Performance Monitor contains
a counter named Memory Pressure State in the Report Server Service object—you can use this to
determine if the Report Server is suffering from memory pressure. You can then include additional
counters to see the state of the memory in relation to the operating system and other BI components.
Rendering performance issues might require you to make an adjustment to a report so that it renders in a
more efficient manner. The BI operations team should also keep on top of understanding whether a
service pack or cumulative update may help to resolve a rendering issue—not only with SQL Server, but
also with the technology that is rendering the report. For example, if there is an issue with rendering an
HTML report to Internet Explorer®, you will also want to consider applying updates to the rendering
technology.
Usability issues
Usability issues can be varied, but common usability issues relate to the following areas of Reporting
Services:
Report parameters usage with snapshots
When a report is configured to use a snapshot to store data, there is an impact on the report parameters
that are linked to query parameters. In this instance, the report parameter in question will be greyed out
and deemed unusable. This is because the query parameter only returns the information that is defined
against the report parameter when a snapshot is created. It takes an image of the data at the time of
creation, and this is then used when the report is accessed. To resolve this issue, you should either not use
a snapshot, or reconfigure the report parameter to use a filter instead.
Linked reports
A common issue is a user stating that a report is not behaving as expected. A linked report is a report that
points to an existing base report but has its own properties configured, such as different default parameter
values. A user will sometimes run a linked report whose properties differ from the report they intended to
use and, as a result, will receive different data from what they expected. In this situation, you should politely
advise the user that they are accessing the wrong report.
Making use of the snapshot feature in Reporting Services also provides the opportunity to save historical
copies of reports. You may receive service desk tickets saying that historical reports are missing. On the
assumption that the historical reports have not been deleted, you should check the retention period that
is configured for the report. This can be done in the properties of a report in the History page—you
should also check the default system-wide retention setting in Site Settings in the web portal.
Failure sending mail. The Report Server could not connect to the email server. Check the email settings in
Reporting Services Configuration Manager.
Failure connecting to a destination folder. The Report Server could not find the location defined in the
subscription. Check that the folder exists and that the Report Server can write to it.
The file a could not be written to file b. The Report Server could not update or overwrite file b with file a.
Check the subscription settings to allow new files to overwrite old files, and then check the permissions of
the folder.
Failure writing file a. File a could not be written to a folder. Check the permissions of the folder.
HTTP 400 bad request. If the error is described as: "The webpage cannot be found" or “HTTP 400
error”, the Report Server database might not be available. You can use the Reporting Services
configuration tool to verify that the database is configured. You can use the Services console
application in Administrator Tools to verify that the SQL Server Database Engine instance has started.
HTTP 503 errors. These errors can occur either during report processing or when you first access a
Report Server. This indicates that the Report Server is suffering from high memory pressure and is
refusing to accept new connections. You should review the Reporting Services configuration in the
context of the other workloads on this system to ensure there is enough memory for the Report
Server.
Adventure Works employees are increasingly frustrated by the time it takes for the business reports to
become available on a daily basis. The existing managed BI infrastructure, including data warehouses
and enterprise data models, is a valued source of decision-making information. However, users are
increasingly finding it takes too long for the data to be processed in the overnight load, resulting in
reports not arriving to business users until the early afternoon.
You are supporting the BI operations team in dealing with service desk tickets that the team thinks will
address the root cause of the issue that Adventure Works is experiencing.
Objectives
At the end of this lab, you will be able to troubleshoot:
Password: Pa55w.rd
2. Ensure that the 10988C-MIA-DC and 10988C-MIA-SQL virtual machines are both running, and then
log on to 10988C-MIA-SQL as ADVENTUREWORKS\Student with the password Pa55w.rd.
4. Using SQL Server Management Studio, manually restore the AW_SSAS Analysis Services database
from the D:\Setupfiles\AW_SSAS.abf file.
3. Rerun the EIM_Demo BI Load job, but note that the job now completes with an error, which you will
resolve in the next exercise.
Used the appropriate logging and monitoring tools to identify the issue.
Using the logging and monitoring tools that you have at your disposal, you will identify the root cause of
the problem and apply a fix to ensure that the BI load completes successfully.
The main tasks for this exercise are as follows:
Note: After adding the data, do not forget to reprocess the dimension before processing
the cube.
2. After reprocessing the dimension, process the EIM Demo cube. Note that another error occurs, this
time caused by a data issue in the Customers dimension.
4. Reprocess the Customers dimension, and then process the EIM Demo cube.
Used the appropriate logging and monitoring tools to identify the issue.
Resolved the unresponsive nature of the BI solution with a permanent fix.
Question: Discuss with the group the approach that you used to identify the root cause issue
of the problem.
Question: On reflection, is there anything you would change about the approach or the
tools that were used to troubleshoot the BI solution?
Module 7
Performance Tuning BI Queries
Contents:
Module Overview 7-1
Lesson 1: The Need for Performance Tuning 7-2
Module Overview
In this course, you have seen many of the operational activities that take place in an organization—they
will often lead to the provision of a long-term solution to an issue that has been occurring in a BI
environment. Sometimes, however, changes to resolve an issue that are made by the BI operations team,
such as optimizing the BI platform, may not have the desired results.
When the BI operations team are satisfied that they have exhausted all areas in attempting to resolve an
issue, they might need to work with the development team to look at tuning the query aspects of the BI
solution to improve performance. Many BI operations make extensive use of queries, and it might be
necessary to look at these queries in more depth to improve performance.
The BI operations team would also have to discuss taking advantage of BI component features to help
improve performance. For example, a suggestion might be made that Reporting Services snapshots could
be used to improve performance. However, the development team would need to be consulted to
understand the impact of using such functionality on the overall solution.
Objectives
After completing this module, you will be able to:
Lesson 1
The Need for Performance Tuning
Performance tuning is the process of making changes or improvements to a system to increase or
maintain performance. As the data loads in a BI system increase, there is typically a negative impact on
the BI system performance; therefore, modifications might need to be made to ensure performance
improvement or performance consistency. Many areas of the BI system can be modified to improve
performance but, occasionally, you will need to look into the code itself. You can seek help from the
developers, to see if changes can be made to the code, or if supporting objects can be created to meet
the performance tuning objective.
Before undertaking such an activity, it is important that the BI operations team remove any potential
obstacles to reviewing the code. A common piece of feedback that an operations team might receive is
that the performance issue has nothing to do with the code; instead, the platform on which the solution
resides is substandard. The BI operations team can remove this obstacle by optimizing the data platform
and providing the supporting evidence to the development team from the information collected in any
logging or monitoring activities—they can then confirm to the developers that the platform is optimized
in the best possible manner.
The key is to optimize all levels of the solution, and if optimization of the platform does not have the
desired results, then the BI code should be reviewed.
Lesson Objectives
After completing this lesson, you will be able to explain:
Common scenarios that make use of queries include data warehouse loads, Analysis Services queries, and
Reporting Services activities—all can have an impact on the performance of a BI solution and should be
considered. In a single server scenario, there can be a focus on optimizing the service to take full
advantage of the hardware on which it is hosted. However, when services are shared on a single server,
there must be a balance of the services across the available hardware, in addition to the times when the
tasks will execute.
Performance tuning can make use of a range of tools, including Activity Monitor, Performance Monitor,
and data collectors. This module will explore using SQL Server Profiler, Execution Plans, and Database
Engine Tuning Advisor to deal specifically with providing information about improving the queries that a
BI solution might use.
SQL Server also includes the Query Store, a database-level feature that gives you
insight into query plan choice and performance for Transact-SQL queries. Query Store automatically
captures a history of the execution plans and statistics. You can review the various plans that have been
used over time, and even force a plan to be used by SQL Server on subsequent executions of a query. This
gives the BI operations team even greater flexibility to optimize queries—it can be used in a range of BI
scenarios, such as loading a data warehouse or retrieving reports.
SQL Server Profiler is an important tool for monitoring SQL Server Analysis Services (SSAS) query
performance. Operating in the same way as profiling for Transact-SQL, it gives you the opportunity to
identify any suboptimal queries. This is relevant for queries that are issued directly against the data model,
or for third-party applications that are querying the data model directly.
Identify any bottlenecks to performance. As the BI solution grows, more demands are placed on
the solution and its resources. This can cause bottlenecks to performance through excessive use of
hardware, or an increase in the amount of locking and blocking, as the number of users increases. It is
important that you use the correct tool for identifying a performance bottleneck. For example, you
could use Task Manager, Performance Monitor or data collectors if the team suspects the issue
involves the hardware on the system. If queries are suspected to be a cause of locking and blocking, it
may be more appropriate to use SQL Server Profiler or Activity Monitor to identify long-running queries,
and then follow up with execution plans to establish the reasons for the bottleneck.
Implement a change. When improving the performance of a query, it might be that indexes are
added to speed up the retrieval of the data without needing to change the underlying code.
Alternatively, the query code itself might be changed. The change should be implemented so that it
can be measured.
Measure the performance. It is important to rerun the processes with your suggested fix in place,
with a view to repeating the same monitoring process that first identified the issue. This will confirm
whether an improvement has taken place. If not, the exercise can be repeated by trying a different fix;
if so, the original fix is accepted and placed as a change in the production environment.
The BI operations team could become involved in the testing process, and so add value by helping the
development and testing team. The BI operations team can conduct performance tests on the code, and
make suggestions for improvements. This will produce the following benefits:
Whilst this activity may not be seen as an operational task, taking this proactive approach can benefit the
business, ensuring that the risks are managed, and that the need for supporting the solution is reduced.
Lesson 2
BI Queries to Performance Tune
Various activities that take place in a BI solution will make extensive use of queries—these can also occur
at different times during the BI operations window. Should there be an issue with performance, it is
important to collect the information that is required to help identify the issue, whilst minimizing the
impact of using the tools to collect the information. Therefore, you should be prudent with the tools that
are used.
Lesson Objectives
In this lesson, you will learn about the queries that are required to performance tune, with regards to:
Transact-SQL queries.
Transact-SQL Queries
Various scenarios in a BI solution will use Transact-SQL queries that you might need to optimize,
including:
ETL data loads using the SSIS Execute SQL Task.
As a result, you can run the following query that shows the top 10 cached execution plans that use the
most cumulative CPU time. It also includes information about the amount of logical and physical reads
that have been used in executing a query.
Using a DMV to identify queries with the highest cumulative CPU time (total_worker_time is reported in microseconds)
--Top 10 CPU Cumulative waits
SELECT TOP 10
[qs].[creation_time]
, [qs].[execution_count]
, [qs].[total_worker_time] as [total_cpu_time]
, [qs].[max_worker_time] as [max_cpu_time]
, [qs].[total_elapsed_time]
, [qs].[max_elapsed_time]
, [qs].[total_logical_reads]
, [qs].[max_logical_reads]
, [qs].[total_physical_reads]
, [qs].[max_physical_reads]
, [st].[text]
, [qp].[query_plan]
, [st].[dbid]
, [st].[objectid]
, [st].[encrypted]
, [qs].[plan_handle]
, [qs].[plan_generation_num]
FROM
sys.dm_exec_query_stats qs
CROSS APPLY
sys.dm_exec_sql_text(plan_handle) AS st
CROSS APPLY
sys.dm_exec_query_plan(plan_handle) AS qp
ORDER BY qs.total_worker_time DESC
It is important that the information collected from the general logging is used to inform you of the
appropriate tools or queries to use to further analyze a specific problem in a particular context. This can
help you further troubleshoot the problems with a query, using Execution Plans, Query Store or the
Database Engine Tuning Advisor.
The storage engine. This stores the data and presents it to the formula engine when a query is
requesting data. It also determines the level of data that should be returned to satisfy the query being
processed by the formula engine.
This appreciation of Analysis Services will help you to better understand the objects that you use in a tool
such as SQL Server Profiler. For example, the Query Subcube Verbose object in SSAS Profiler will list all the
requests that are made from the formula engine to the storage engine. The Data from Aggregations
object will inform Profiler if the data retrieved from the storage engine and sent back to the formula
engine is from aggregated data, rather than leaf-level detailed data. The absence of rows in Profiler from
the Data from Aggregations object may indicate that the aggregation design for the cubes needs
recreating.
For tabular data models, you would want to include the VertiPaq SE Query End object, in conjunction with
the duration column, and then apply a filter where EventSubclass = "0 - VertiPaq Scan". If the duration for
these events is more than 50 percent of the total duration of a query, this indicates that the issue lies
within the storage engine.
As previously discussed, it is important to understand if the query is being executed from a cold cache or
a warm cache, as this can affect performance.
You can clear the SSAS cache to ensure there is a cold cache by running the following XMLA query in SQL
Server Management Studio connected to SSAS. The execution of a query will populate the cache, and
subsequent executions of the same query will work from a warm cache.
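A minimal sketch of such a ClearCache command is shown below; AW_SSAS is an assumed database ID used only for illustration, and you would substitute the ID of your own Analysis Services database:
<ClearCache xmlns="http://schemas.microsoft.com/analysisservices/2003/engine">
  <Object>
    <!-- AW_SSAS is an assumed database ID for illustration -->
    <DatabaseID>AW_SSAS</DatabaseID>
  </Object>
</ClearCache>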
When analyzing SSAS queries, it is best practice to clear the cache first before starting an analysis, so that
you can observe the performance difference between a cold cache and a warm cache.
Sources
Transforms
Destinations
Transformations can often be a source of performance issues, and this can be dependent on the type of
transformations that are used. SSIS deals with the following three categories of transformation:
Non-blocking transformations. These transformations use the same buffer space in memory to
both consume the input data and output the transformed data. This has a minimal impact on the
performance of the SSIS data flow. Examples include the Data Conversion transform and the Derived
Column transform.
Semi-blocking transformations. These transformations use one area of buffer space in memory to
consume the input data—this will create additional buffer space for the output data. These
transformations might introduce additional CPU threads to process the data. Examples include the
Merge transform and the Merge Join transform.
Blocking transformations. These transformations place the heaviest performance burden on the
SSIS subsystem. They must consume the entire input data set before they can produce any output
rows, which requires additional buffer space in memory, and they introduce additional CPU
threads to process the data. Examples include the Sort transform and the Aggregate transform.
SSIS logging will help to identify any transformations that are causing performance issues. In the case of
blocking transformations, it is more prudent to remove these transforms from the SSIS packages and
replace them with Transact-SQL equivalents. For example, a Sort transformation could be replaced by an
ORDER BY clause in a Transact-SQL statement when retrieving data from a data source.
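As a minimal illustration of this approach, the source query below returns pre-sorted rows so that a downstream Sort transform is unnecessary; the table and column names are assumptions used only for this sketch:
-- Sort at the source instead of in the SSIS data flow
SELECT CustomerKey, FirstName, LastName
FROM Staging.Customers
ORDER BY CustomerKey;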
For data movements that use the Execute SQL task, you should follow the steps for performance tuning
Transact-SQL queries. This includes using general logging and SSIS logging to identify the impact of
specific Execute SQL tasks, and then perform the targeted analysis with the appropriate query tuning tool.
In addition, you can advise the developer team to configure the package execution settings to optimize
the package for the production environment. Some properties to configure can include:
RunInOptimizedMode. A data flow property that improves performance by removing unused columns,
outputs, and components from the data flow.
DefaultBufferSize. This determines the amount of memory that each data flow buffer can use. The default is
10 MB but it can be set up to 100 MB.
DefaultBufferMaxRows. This determines the number of rows that can be held in the buffer, within
the DefaultBufferSize limit—the default value is 10,000.
EngineThreads. This sets the number of threads that the data flow task can use during execution.
Report Generation
SQL Server Reporting Services reports can make use
of Transact-SQL, MDX or DAX queries to retrieve
data from a range of data sources. Reports are
manually pulled when the user browses the web
portal and clicks on a report. They can also be
scheduled to execute to facilitate the subscriptions
that are set up for users, or for the generation of
caches or snapshots.
Beyond this, you should consider using the Execution Logs in Reporting Services to determine which
queries are taking the longest to generate. You should compare this with the general logging and
monitoring to establish which part of the SQL Server’s subsystem is being impacted. You can then
determine the appropriate query tuning tool to use to provide further analysis.
Lesson 3
Tools for Performance Tuning
A wide variety of tools can be used to performance tune queries. Some of these tools might even provide
recommendations on how a given query can be improved. The tools that can be used to resolve common
SSIS, SSAS, and SSRS issues are outlined here. You may choose to use only one tool but it is likely that you
will use more of them to provide evidence for how you will apply a fix.
Lesson Objectives
In this lesson, you will see how to use:
Execution Plans.
Query Store.
Execution Plans
The performance of queries in a BI solution may
sometimes be substandard and require analysis. To
that end, SQL Server provides Execution Plans so
that the user can see how a query is being
executed. It may also make suggestions on which
indexes can be created to improve the performance
of a query.
Interpreting execution plans accurately involves understanding the setup of a server, understanding the
server’s workload, and understanding SQL Server’s internal table and data structures. When armed with
this information, there can be a substantial performance improvement in responding to a query execution
plan and modifying a query; or creating an index to improve the performance.
You can access Execution Plans when a query window is open in SQL Server Management Studio—you
can select Query on the menu bar, and then click Include Actual Execution Plan. Alternatively, you can
select Display Estimated Execution Plan without running the query.
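You can also request an estimated plan from Transact-SQL. The following sketch uses SET SHOWPLAN_XML, which returns the estimated plan as XML without executing the query; the sample table and predicate are assumptions taken from an AdventureWorks-style schema and are for illustration only:
SET SHOWPLAN_XML ON;
GO

-- The query is compiled but not executed; the estimated plan is returned as XML.
SELECT SalesOrderID, OrderDate
FROM Sales.SalesOrderHeader
WHERE CustomerID = 11000;
GO

SET SHOWPLAN_XML OFF;
GO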
The critical aspect of using Execution Plans is to look for the existence of a scan in the query operations. A
scan is a query operation that involves looking through the entire contents of a table or an index. In some
cases, this may be appropriate, but usually it is more efficient to perform index seeks. Adding indexes to a
table will improve query performance for targeted searches. The following table shows the scan and seek
icons to look for in the execution plan:
(Table of scan and seek operator icons not reproduced here; operators include, for example, Table Scan.)
A figure that shows a percentage of the relative cost of the operation, in the context of the whole query,
will appear under the icon—you should look for the icon with the highest percentage. Further information
can be found on any icon by hovering over the icon and displaying a tooltip. This returns the following
additional information:
Physical Operation The physical operator used, such as Hash Join or Nested Loops.
Physical operators displayed in red indicate that the query
optimizer has issued a warning, such as missing column
statistics or missing join predicates. This can cause the query
optimizer to choose a less efficient query plan than otherwise
expected.
When the graphical execution plan suggests creating or
updating statistics, or creating an index, the missing column
statistics and indexes can be immediately created or updated
using the shortcut menus in SQL Server Management Studio
Object Explorer.
Logical Operation The logical operator that matches the physical operator, such
as the Inner Join operator. The logical operator is listed after
the physical operator at the top of the tooltip.
Estimated Row Size The estimated size of the row produced by the operator
(bytes).
Estimated I/O Cost The estimated cost of all I/O activity for the operation. This
value should be as low as possible.
Estimated CPU Cost The estimated cost of all CPU activity for the operation.
Estimated Operator Cost The cost to the query optimizer for executing this operation.
The cost of this operation as a percentage of the total cost of
the query is displayed in parentheses. Because the query
engine selects the most efficient operation to perform the
query or execute the statement, this value should be as low as
possible.
Estimated Subtree Cost The total cost to the query optimizer for executing this
operation and all operations preceding it in the same subtree.
Estimated Number of Rows The number of rows produced by the operator. This tooltip
item displays as Number of Rows in an Actual Execution Plan.
There are many other icons, such as Sort and Delete, that are returned by execution plans. Looking for
scans and seeks would be the starting point for examining the execution plans of queries.
Query Store
The SQL Server Query Store can provide insights
into current and historical use of the query plans
that are captured and stored within the database on a SQL
Server instance. This capability allows you to find
and fix a query plan performance regression by forcing a
previous query plan that performed better.
A Query Store node will appear under each database for which the feature has been enabled. Within this is a node named
Regressed Queries that shows queries whose performance has regressed, together with the multiple
execution plans that have been recorded for them. In the Regressed Queries window, you can view the queries and
the associated execution plans for each query. You can also order the queries in the list based on various
criteria, such as CPU time, logical reads, and physical reads. The Query Store makes it much easier to
correlate execution plans against a set of criteria.
You can also use a feature known as Plan Forcing to direct the query optimizer to use a specific
execution plan for a given query. To force a plan, select a query and plan in the Regressed Queries
window, and then click Force Plan. You can only force plans that have been captured by the Query Store
and are still retained in its plan store.
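Plan forcing can also be performed from Transact-SQL by using the sp_query_store_force_plan system stored procedure. In the sketch below, the query and plan IDs are placeholders that you would read from the Query Store views or reports:
-- Force plan 7 for query 42 (IDs are illustrative placeholders)
EXEC sys.sp_query_store_force_plan @query_id = 42, @plan_id = 7;

-- Remove the forced plan again
EXEC sys.sp_query_store_unforce_plan @query_id = 42, @plan_id = 7;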
A number of additional options can be defined when enabling the Query Store; SQL Server provides
DMVs to give information about the state of the Query Store.
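A minimal sketch of enabling the Query Store with a few common options, and then checking its state through the sys.database_query_store_options catalog view, is shown below; EIM_Demo is used as an example database name:
-- Enable the Query Store on the database with some typical options
ALTER DATABASE EIM_Demo SET QUERY_STORE = ON;
ALTER DATABASE EIM_Demo SET QUERY_STORE
    (OPERATION_MODE = READ_WRITE, MAX_STORAGE_SIZE_MB = 100, QUERY_CAPTURE_MODE = AUTO);

-- Check the current state of the Query Store from within the database
USE EIM_Demo;
SELECT actual_state_desc, readonly_reason, current_storage_size_mb, max_storage_size_mb
FROM sys.database_query_store_options;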
The Database Engine Tuning Advisor tool can be a useful resource for making appropriate index tuning
recommendations.
The Database Engine Tuning Advisor is a separate tool that is found in the Windows® Start menu. First,
you must authenticate to an instance of SQL Server, and define a name for the work. Next, you specify the
workload file or table. There is also the option to use the plan cache as a source of the workload. This
is useful in scenarios where the server has been running for a long period of time.
You should then select the database and/or individual tables to tune. There are a range of tuning options
to use, such as tuning for Physical Design Structure. You can determine which partitioning strategy to
apply, and there are also advanced options that help you to restrict the amount of space used for the
recommendations, and whether or not to allow online index recommendations. When the options are set,
you can start the analysis.
On completion of the analysis, two additional tabs are presented. The Recommendations tab appears with
an estimated improvement, where you have a breakdown of partition recommendations, if specified, and
index recommendations. This tab provides information including the database name, the object name,
and the recommendation. The Reports tab has a summary of the tuning process that has taken place.
There are also Tuning reports where you can select a specific report, such as an Index Usage report, that
will give you a list of indexes that are used, and the number of references to it, based on the workload file
that has been provided.
In the Recommendations tab, you can click a recommendation, and deselect all recommendations or
individual recommendations. You can then use the Actions menu to apply, save, or evaluate the
recommendations.
The Usage-Based Optimization Wizard can be accessed in the Aggregations tab of the Cube designer
within Visual Studio. You can also access the wizard within SQL Server Management Studio by right-
clicking a partition within Object Explorer. The wizard asks you to perform the following steps:
Select the partitions to modify. In the Usage-Based Optimization Wizard, this screen will allow you
to choose whether to apply the wizard to the entire cube or to specific partitions.
Specify query criteria. In the Usage-Based Optimization Wizard, this screen allows you to select
criteria for the queries that you want to optimize. Queries can be selected based on date, user or
frequency.
Review the queries that will be optimized. In the Usage-Based Optimization Wizard, this screen
allows you to select the specific queries that will be optimized from the queries returned by the
options defined within the Specify Query Criteria screen.
Specify object counts. In the Usage-Based Optimization Wizard, this screen allows you to perform a
count of the objects within the cube and the dimensions that will provide a realistic level of data that
the data model will store.
Set Aggregation Options. In the Usage-Based Optimization Wizard, this screen allows you to choose
the aggregation storage options, based on hard disk limits or performance limits.
Complete the Wizard. In the Usage-Based Optimization Wizard, this screen allows you to specify
how the partitions are created and deployed before finishing the wizard.
The more query logging data that can be collected, the more valuable the Usage-Based Optimization
Wizard will be in designing effective aggregations. As a result, you might want to run Query Logging for a
period of time before using the data to optimize the aggregations.
If the information for the TimeDataRetrieval column is a high value, this should prompt you to focus on
optimizing the query that returns data to the report using features such as Execution Plans or Query Store.
If the TimeProcessing value is high, this might indicate that there is pressure on the Reporting Services
system. Performance Monitor counters for the operating system and Reporting Services can be used to
confirm if this is the case—particularly the Memory Threshold state found in Reporting Services object, or
the % Processor time in the Processor object. Viewing the Disk Queue length for the drives on which SSRS
is stored can also provide information about the disk subsystem.
A high TimeRendering value should prompt you to look at how the report has been constructed. For
example, if there is an external image used within the report that is stored on a network share, it may take
time to retrieve the image from the network share; therefore, embedding the image within the report
itself will speed up the rendering process.
You can also join the contents of the view with other views and tables within the ReportServer database.
For example, the following query joins the ExecutionLog view with the Catalog table to return the name
of the report item and how it has performed at any given time:
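The exact query text is not reproduced here; a minimal sketch of such a join, run in the context of the ReportServer catalog database with a trimmed column list, might look like the following:
-- Join report execution history to the report name in the ReportServer database
SELECT  c.Name AS ReportName,
        e.TimeStart,
        e.TimeDataRetrieval,
        e.TimeProcessing,
        e.TimeRendering,
        e.Status
FROM dbo.ExecutionLog AS e
JOIN dbo.[Catalog] AS c
    ON e.ReportID = c.ItemID
ORDER BY e.TimeStart DESC;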
Lesson 4
Remediating Performance Issues
A number of techniques can be used to remediate performance issues that occur with a BI solution.
Before embarking on remediation, you should ensure that the data platform is stable, and that any
existing errors are resolved before performing changes to improve the queries. Using these techniques
effectively requires a deep understanding of the data model—the BI operations team should liaise with
the development team to ensure that any proposed changes are in line with the data model that has been
developed. Not all of the techniques outlined will necessarily solve an issue, and it is important that
changes are tested before they are deployed into production.
Lesson Objectives
At the end of this lesson, you will remediate performance issues by using:
Indexing.
Analysis Services partitioning.
Refactoring queries.
Using Indexes
Indexes are SQL Server objects that can improve the
performance of retrieving data that uses Transact-
SQL queries. There is a cost associated with indexes
because they consume disk space; however, if
indexes are used in a pragmatic way, the benefits
they bring outweigh this cost. In addition, indexes
can slow down insert and update operations that
occur on the database. These operations are
predictable in a data warehouse and the indexing
can be managed so that there are no indexes when
data is being loaded or updated in a data
warehouse. Indexes can then be placed on the
relevant tables when the load is completed, ready for other BI components to use.
SQL Server stores data in 8 KB pages, which are grouped into 64 KB extents of eight pages. A single table may be spread
over many extents across the disk in a data structure called a heap. For SQL Server to retrieve this data, it
must identify the extents that belong to the table.
Indexes bring order to the data and come in two forms. In a traditional indexing structure there are
clustered indexes and nonclustered indexes. You can create only a single clustered index against one or
more columns of a table or a view. When this is done, the actual data that is stored in a heap is physically
and logically reordered into a contiguous space on the disk. This makes the retrieval of the data quicker,
especially when a query is based on a column that is part of the index.
Nonclustered indexes are typically created to support the queries that are used to retrieve data from a
table or a view. There can be up to 999 nonclustered indexes per table. Nonclustered indexes do not
physically reorder the data; instead, they create an index of pointers to where the data is stored on the
disk. Queries that look for matching values or return a small range of values will perform better with this
type of indexing.
Typically, clustered indexes are created on key columns to organize the data, with nonclustered indexes
created to support the queries that are issued against the data warehouse.
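A minimal sketch of this pattern for an assumed dimension table is shown below; the table and column names are illustrative only:
-- Clustered index on the key column physically orders the data
CREATE CLUSTERED INDEX IX_DimCustomer_CustomerKey
ON EDW.DimCustomer (CustomerKey);

-- Nonclustered index to support lookups on a frequently filtered column
CREATE NONCLUSTERED INDEX IX_DimCustomer_LastName
ON EDW.DimCustomer (LastName);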
Columnstore indexes are ideal for large data stores that are typically associated with data warehouses—
especially those that store fact table data. The key benefit of columnstore indexing is that it can compress
the data to a high ratio, providing much higher performance than that previously achieved with
traditional indexes. With SQL Server, the capability has been enhanced to include columnstore indexes for
use with real-time analytics on operational workloads.
In the context of data warehousing, when columnstore indexing is used with an aligned table partitioning
strategy, the retrieval of data is much faster, particularly with full table scans. In SQL Server, you can also
use a nonclustered index to help when performing targeted or range searches against tables that
have a defined columnstore index. As a result, a columnstore index would be created against an entire
table to take advantage of the compression and performance, with a nonclustered index created on a
column that is used in range or targeted queries.
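A minimal sketch of this combination against an assumed fact table is shown below; the object names are assumptions for illustration:
-- Columnstore index over the whole fact table for compression and scan performance
CREATE CLUSTERED COLUMNSTORE INDEX CCI_FactInternetSales
ON EDW.FactInternetSales;

-- Nonclustered rowstore index to support targeted or range searches
CREATE NONCLUSTERED INDEX IX_FactInternetSales_OrderDateKey
ON EDW.FactInternetSales (OrderDateKey);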
Furthermore, in multidimensional mode, each partition can be configured with its own storage mode. For
example, you may create a partition that stores the current year’s data in multidimensional online
analytical processing (MOLAP) storage mode, and stores the historical data in a separate partition in
hybrid online analytical processing (HOLAP) storage mode. Further performance gains can be achieved if
the Analysis Services cube partitioning is aligned to the table partitioning strategy that is created in data
warehouse tables. By default, a partition is created for each measure group within the cube. To replace
the default partitioning design, you can right-click the default partition and click Delete.
To create a new partition, click the partition node, and then click New Partition. Click Next and define
the source information, which would typically be a fact table. When you click the next screen you can then
define a query that includes a WHERE clause to restrict the data that will be stored in a partition. For
example, you could use a join between the fact table and a DimDate table on the orderdate key and the
datekey; then, in the WHERE clause, the FulldateAlternatekey is set to a date range that restricts the data
by a date range.
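A sketch of such a partition source query, using AdventureWorks-style table and column names as assumptions, might look like the following:
-- Restrict the partition to a single year of fact data by joining to the date dimension
SELECT f.*
FROM dbo.FactInternetSales AS f
JOIN dbo.DimDate AS d
    ON f.OrderDateKey = d.DateKey
WHERE d.FullDateAlternateKey >= '20170101'
  AND d.FullDateAlternateKey < '20180101';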
You can specify the processing location of the partition, which is either the current server instance or a
remote instance of Analysis Services. You can then define the storage location, which can be the default
server location or a location of choice. Click Next and define a name for the partition. You can design the
aggregations for the partition now, design the aggregation later, or copy the aggregation from an
existing partition before finishing off the wizard.
Caching
To improve the performance of retrieving the report data, you can cache a temporary copy of the report.
With this setting defined, when a report is first run, it is retrieved from the data source and stored in the
cache within the ReportServerTempDB database. Subsequent execution of the same report retrieves the
report data from the cache. It is important to note that, on a server reboot or a service restart, the
contents of the ReportServerTempDB database are emptied.
Snapshot
An alternative approach is to create snapshots. Snapshots can be created based on a schedule in advance
of the user browsing the report. The report snapshot is stored within the ReportServer database and the
user will browse the report snapshot stored within the ReportServer. Snapshots can also be used as the
basis to store historical copies of the report for future reference.
If a report contains parameters, the parameter value defined determines the data that is returned to the
cache. When parameters are used with a snapshot, the parameter value cannot be changed. However,
filters return all of the data to the report server cache. Furthermore, if a snapshot is defined on a report
with a filter, the report parameter that uses the filter can have its value changed.
You can configure the caching and reporting options by using the Manage option.
Refactoring Queries
Transact-SQL is a declarative language—this means
that a query can be written in a number of different
ways to return the same results. However, queries
might perform at different speeds. As a result, it
may sometimes be more pragmatic to rewrite a
query to be more efficient.
The WHERE and the SELECT clause will then filter the data from the source to produce a subset of
data.
The same steps are used in an aggregate query, but then SQL Server will evaluate the columns in the
GROUP BY clause to group the data together. If a HAVING clause is used in conjunction with the GROUP
BY clause, a filter is applied to the grouped data. If an ORDER BY clause is in the query, it will then sort the
results. You can improve the performance of the query by including columns in the index that are in the
query. This can have a noticeable impact when including columns that are defined in a WHERE clause.
This can be especially useful when working with a GROUP BY clause, as the WHERE clause can be used to
filter the set of data before the GROUP BY clause is applied. If you use the HAVING clause to filter the
data, using an index is unlikely because an aggregation operation is being performed.
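As a brief illustration of including query columns in an index, the sketch below creates a nonclustered index whose key supports the WHERE clause and whose INCLUDE column covers the aggregation; the object names are assumptions for illustration:
-- Index keyed on the filtered column, covering the aggregated column
CREATE NONCLUSTERED INDEX IX_FactInternetSales_OrderDateKey_Covering
ON EDW.FactInternetSales (OrderDateKey)
INCLUDE (SalesAmount);

-- The WHERE clause filters rows before the GROUP BY is applied
SELECT OrderDateKey, SUM(SalesAmount) AS TotalSales
FROM EDW.FactInternetSales
WHERE OrderDateKey >= 20170101
GROUP BY OrderDateKey;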
However, if this does not work, there are a number of guiding principles to consider should a query need
rewriting from a performance point of view:
Use set-based logic rather than cursor logic. The query optimizer in SQL Server is designed to
interpret and optimize set-based operations. Procedural or cursor-based logic cannot
take full advantage of the capability that the query optimizer has to offer, so it will perform at a
suboptimal level.
Avoid using query hints. If you are experiencing performance issues with a query, look out for query
hints that may be explicitly defined in a query. Query hints direct how a query should retrieve data
and ignore the suggestions made by the query optimization process. It is worth baselining the
performance of a query with the query hint in place—you can then comment out the hint and rerun
the query to establish the delta in performance. There are limited scenarios where white papers might
advise the use of query hints; however, subsequent cumulative updates may negate the need to use
them if there is a fix in the update.
Rewrite SQL queries to remove correlated subqueries. Correlated subqueries evaluate the data
that exists in multiple tables on a row-by-row basis and therefore impact the performance of the
query. The EXISTS clause may introduce an improvement, but it is better to try to rewrite correlated
subqueries as a JOIN query, to take full advantage of the query optimization process (see the sketch
after this list).
Avoid using scalar user-defined functions in the WHERE clause. Avoid using scalar user-defined
functions in the WHERE clause because they are not optimized as part of a query plan and are
evaluated on a row-by-row basis.
Rewrite SQL to simplify a query using CTEs or Temp tables. Try to break up long and complex
queries into smaller units of work by using Common Table Expressions or temporary tables. This will
reduce the IO required for the query optimizer to find a good plan against a subset of data that can
be joined later to produce a final result set.
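The sketch below illustrates the correlated subquery principle from the list above by rewriting a per-row scalar subquery as an equivalent LEFT JOIN with GROUP BY; the table and column names are assumptions for illustration:
-- Correlated scalar subquery: evaluated for each customer row
SELECT c.CustomerKey,
       (SELECT SUM(f.SalesAmount)
        FROM EDW.FactInternetSales AS f
        WHERE f.CustomerKey = c.CustomerKey) AS TotalSales
FROM EDW.DimCustomer AS c;

-- Set-based rewrite: LEFT JOIN preserves customers with no sales (TotalSales is NULL)
SELECT c.CustomerKey, SUM(f.SalesAmount) AS TotalSales
FROM EDW.DimCustomer AS c
LEFT JOIN EDW.FactInternetSales AS f
    ON f.CustomerKey = c.CustomerKey
GROUP BY c.CustomerKey;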
These guiding principles provide a starting point for refactoring a query. However, you should always be
guided by the evidence that is presented in the monitoring tools, such as Execution Plans or the Query
Store.
You are a consultant working with the BI operations team to improve the operational management of
their current BI solution. You have recently been working with the operations team to remediate issues
that have occurred with the BI solution. These have been resolved, and the focus of the team is now to look
for improvements in the extracts of source data within the BI solution.
Objectives
At the end of this lab, you will be able to:
Password: Pa55w.rd
4. Execute the query with the Include Actual Execution Plan option set when running the query.
6. Execute the query with the Include Actual Execution Plan option set when running the query.
8. Execute the query with the Include Actual Execution Plan option set when running the query.
2. Use Queries.docx in the D:\labfiles\Lab07\Starter folder as a framework to identify the issues with
the queries, based on the execution plans.
Question: How often do you get the opportunity to review the production queries that are
working on your systems?
Question: Will you use the Query Store feature? What benefits do you see it bringing to
your organization?
Course Evaluation
Your evaluation of this course will help Microsoft understand the quality of your learning experience.
Please work with your training provider to access the course evaluation form.
Microsoft will keep your answers to this survey private and confidential and will use your responses to
improve your future learning experience. Your open and honest feedback is valuable and appreciated.
5. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
2. Discuss the interviews and identify roles, responsibilities and potential employees.
3. In File Explorer, open Roles.docx in the D:\Labfiles\Lab01\Starter folder using WordPad.
4. Based on the available information, review the roles and assess the responsibilities required, and then
decide which employees should work in those roles in Roles.docx.
Results: At the end of this exercise, you should have created a table that shows the roles required, with a
named employee who has key responsibilities.
2. In the New Team Project on mia-sql\AdventureWorks BISolutions dialog box, on the Specify the
Team Project Settings page, in the What is the name of the team project? box, type Adventure
Works, and then click Next.
3. On the Select a Process Template page, in the Which process template should be used to create
the team project? list, click Scrum, and then click Next.
4. On the Specify Source Control Settings page, in the Choose a version control system for the
new project list, ensure Team Foundation Version Control is selected, and then click Next.
2. In the New Project dialog box, in the Templates list, click Business Intelligence, and then click
Integration Services Project.
3. In the Name box, type AWMigration, and then click Browse.
4. In the Project Location dialog box, browse to the D:\Labfiles\Lab01\Starter folder, and then click
Select Folder.
5. Ensure the Add to source control check box is checked.
6. In the Solution name box, type AWMig, and then click OK.
7. In the Add Solution AWMig to Source Control dialog box, review the settings, and then click OK.
8. In Solution Explorer, under SSIS Packages, right-click Package.dtsx, and then click Rename.
10. Right-click the AWMig solution, and then click Check In.
11. In the Microsoft Visual Studio dialog box, click Yes to save all changes.
12. In the Team Explorer - Pending Changes pane, in the Comment box, type Control package for data
loads, and then click Check In.
14. In the Team Explorer - Pending Changes pane, verify that a message stating that the changeset
committed successfully appears.
15. In the Team Explorer - Pending Changes pane, click the Home icon.
17. In the Source Control Explorer tab, in the Folders section, click Adventure Works.
18. In the details section, double-click AWMig.
19. Confirm that a solution file appears in the details window with the name AWMig.sln.
20. Close Visual Studio 2017 and save all changes if prompted.
Results: At the end of the exercise, you will have configured Team Explorer to connect to a TFS server
named mia-sql. You will have created a project collection and stored an Integration Services project
within the project collection in the TFS server. You will have made a change to an object and checked the
object back in to TFS. Finally, you will view the changes in Source Control Explorer.
4. Fill in the area of impact and the recommended change. Aim for two recommendations per
application section.
5. Discuss the findings with the instructor and the class.
6. Close WordPad.
Results: At the end of this exercise, you should have created a table that shows which areas of the data
platform should be standardized, including:
4. In the System window, under Control Panel Home, click Advanced system settings.
5. In the System Properties dialog box, on the Advanced tab, under Performance, click Settings.
6. In the Performance Options dialog box, on the Visual Effects tab, ensure Adjust for best
performance is selected.
7. On the Advanced tab, under Processor scheduling, ensure Background services is selected, and
then click OK.
2. In the Local Group Policy Editor window, expand Windows Settings, expand Security Settings,
expand Local Policies, and then click User Rights Assignment.
5. In the Select Users, Computers, Service Accounts, or Groups dialog box, type ServiceAcct, click
Check Names, and then click OK.
6. In the Lock pages in memory Properties dialog box, click OK.
2. In the Perform volume maintenance tasks Properties dialog box, click Add User or Group.
3. In the Select Users, Computers, Service Accounts, or Groups dialog box, type ServiceAcct, click
Check Names, and then click OK.
4. In the Perform volume maintenance tasks Properties dialog box, click OK.
6. For this change to take effect, log out, and then log back in as ADVENTUREWORKS\Student with
the password Pa55w.rd.
2. In the Connect to Server dialog box, connect to the MIA-SQL instance of the SQL Server database
engine by using Windows authentication.
2. In the Database Properties - tempdb dialog box, under Select a page, click Files.
3. Note the number of data files present and the location of the data files, and then click Cancel.
4. On the toolbar, click New Query.
6. In the Microsoft SQL Server Management Studio dialog box, click Yes.
7. In the Microsoft SQL Server Management Studio dialog box, click Yes.
10. In the Database Properties - tempdb dialog box, under Select a page, click Files.
11. Note the number of data files present and the location of the data files, and then click Cancel.
13. In the Server Properties - MIA-SQL dialog box, under Select a page, click Advanced.
14. Under Miscellaneous, ensure that Optimize for Ad hoc Workloads is set to True.
16. Ensure that the Minimum server memory (in MB) is 2048, and the Maximum server memory (in
MB) is 4096, and then click Cancel.
17. Close SQL Server Management Studio, without saving any changes.
<WorkingSetMaximum>3000000</WorkingSetMaximum>
<WorkingSetMinimum>2400000</WorkingSetMinimum>
Task 2: Set the SQL Server Authentication Mode on MIA-SQL to SQL Server and
Windows Authentication
1. On the taskbar, click Microsoft SQL Server Management Studio.
2. To connect to the MIA-SQL SQL Server instance, in the Connect to Server dialog box, ensure that
the Server type is Database Engine, Authentication is Windows Authentication, and then in the
Server name list select MIA-SQL.
3. Click Connect.
7. If a Microsoft SQL Server Management Studio message box pops up, click OK.
In the Microsoft SQL Server Management Studio message box, click Yes twice.
8. In the Locations dialog box, click Entire Directory, and then click OK.
12. In the Login – New dialog box, ensure that the Windows Authentication option is selected, and
then in the Default database list, select AdventureWorks.
2. In the Login – New dialog box, in the Login Name text box, type SalesApp.
7. Click OK.
3. In the Database User - New dialog box, in the User type drop-down, select SQL User with Login.
5. Select the option Login Name and click the ellipsis (…) button.
7. In the Browse for Objects dialog box, select the AdventureWorks\DL_ReadSalesData check box,
and then click OK.
11. To grant Select permission for the EIM_SalesReaders user over the EDW schema, in Object Explorer,
expand Databases, expand EIM_Demo, expand Security, and then expand Schemas.
13. In the Schema Properties - EDW window, under the Select a page pane, click Permissions.
16. In the Browse for Objects dialog box, click the EIM_SalesReaders check box, and then click OK.
18. In the Schema Properties - EDW window, under Permissions for EIM_SalesReaders, click the Select
check box in the Grant column.
4. In the Connect to a Project dialog box, expand mia-sql, and then click
AdventureWorksBISolutions
5. Click Connect.
6. In the Team Explorer - Home pane, right-click AW_BI.sln, then click Open.
3. In the Properties pane, change the File Name to DB Process Role.role, and then press Enter. When
prompted, click Yes to change the object name.
4. On the General page of the Role Designer, select the Process Database and Read Definition check
boxes.
5. Click the Membership tab, and then in the Specify the users and groups for this role area, click
Add.
8. In the Enter the object names to select box, type GOBrien, click Check Names, and then click OK.
10. In Solution Explorer, right-click the AW_SSAS solution, and then click Deploy. If prompted to
overwrite the existing database, click Yes.
11. When the deployment has completed successfully, close the Deployment Progress window.
6. In the Server Name drop-down list, type MIA-SQL, and then click Connect.
7. In Object Explorer, expand Databases, right-click the AW_SSAS database, and then click Process.
8. In the Object list, click Process Default in the Process Options list, and then click OK.
10. In Object Explorer, expand the AW_SSAS database, right-click Roles, and then click New Role.
11. In the Create Role window, in the Role name box, type TestRole, and then click OK.
12. Click OK in the error message stating that this user does not have permissions to create new objects.
13. In the Create Role window, click Cancel to close the window.
Created a database role and added a database user within the role.
4. In the Connect to Server dialog box, in the Server type list, ensure that Reporting Services is
selected.
6. In the Authentication list, ensure that Windows Authentication is selected, and then click Connect.
7. In the navigation pane on the left side, expand the Security folder.
9. In the New System Role dialog box, type SecurityAdmin in the Name text box.
10. In the Description box, type Can set item and site security.
11. Select only the Manage report server security and Manage roles check boxes.
15. On the top menu, click the cog (settings), then click Site settings.
6. On the Windows Start menu, click the user icon, and then click Sign out.
3. Click Connect.
4. In Object Explorer, expand the Databases node, expand EIM_Demo, expand Security, and expand
Users.
8. In the Extract Data-tier Application window, on the Introduction page, click Next.
9. In the Set Properties page, type EIM_Demo_Test for the application name, and type 1.1.0.0 under
version.
10. Add the description DACPAC of the EIM database, and then under Save to DAC package file,
browse to the D:\Labfiles\Lab04\Starter folder, click Save and then click Next.
12. On the Build Package page, when the build is complete, click Finish.
2. In the Connect to Server dialog box, ensure that the Server type is Database Engine,
Authentication is Windows Authentication, and then in the Server name list, select MIA-
SQL\SQL2.
3. Click Connect.
4. In Object Explorer, under MIA-SQL\SQL2, right-click the Databases node, and click Deploy Data-
tier Application.
5. In the Deploy Data-tier Application window, on the Introduction page, click Next.
9. On the Deploy DAC page, when the build is complete, click Finish.
10. In Object Explorer, under MIA-SQL\SQL2, right-click the Databases node and click Refresh.
11. Expand the databases node and verify that the EIM_Demo_Test database appears.
12. Close SQL Server Management Studio.
2. In Visual Studio 2017, click the Team Explorer tab and then click the Home icon.
8. In the Internet Explorer notification bar, click Save, and then click Save As.
10. On the Windows desktop, click File Explorer and browse to D:\Labfiles\Lab04\Starter.
11. Right-click the agent zip file and click Extract All.
12. In the extract compressed (Zipped) Folders dialog box, extract all files to the
D:\Labfiles\Lab04\Starter\agent folder.
13. In File Explorer, move to the D:\Labfiles\Lab04\Starter\agent folder, right-click config.cmd, and
then click Run As Administrator.
15. At the Enter Server URL prompt, type the following text, and then press Enter:
http://mia-sql:8080/tfs
16. At the Enter authentication type prompt, press Enter to accept the default value of Integrated
authentication.
17. At the line Enter agent pool prompt, press Enter to accept the default value.
18. At the Enter agent name prompt, accept the default value (MIA-SQL), and press Enter.
19. At the Enter work folder prompt, type D:\Labfiles\Lab04\Starter\agent\_work, and then press
Enter.
20. At the Enter run agent as service? prompt, type Y, and then press Enter.
21. At the Enter the user account to use for the service prompt, type AdventureWorks\ServiceAcct,
and then press Enter.
22. At the Enter Password for user account AdventureWorks\ServiceAcct prompt, type Pa55w.rd,
and then press Enter.
4. On the Tasks page, in the Name box, enter AW_BI Build definition
5. In the Agent queue drop-down list box, select Default.
7. Under From, click This project, and ensure that the Repository is set to $/Adventure Works ETL
9. In the Add tasks pane, scroll down, click Command Line, and then click Add.
14. In the Queue build for AW_BI Build definition window, click Queue
15. Verify that the status message Build #1 has been queued appears in the status bar near the top of
the page, and click the #1 link in this message.
16. Monitor the console output as the solution is built, and verify that the build completes without any
errors.
17. Close Internet Explorer.
3. Click Connect.
4. In Object Explorer, under MIA-SQL, expand the Databases node, and right-click EIM_Demo, and
then click Delete.
5. In the Delete Object dialog box, select the Close existing connections check box, and then click
OK.
6. In Object Explorer, under MIA-SQL, right-click the Databases node, and click Deploy Data-tier
Application.
7. In the Deploy Data-tier Application dialog box, on the Introduction page, click Next.
10. On the Update Configuration page, under Name type EIM_Demo, and then click Save Script.
11. In the Save Post-Deployment Script dialog box, click Documents, and then click Save.
14. On the Deploy DAC page, when the build is complete, click Finish.
15. In Object Explorer, under MIA-SQL, right-click the Databases node and click Refresh. The
EIM_Demo database will appear.
16. In Object Explorer, expand the EIM_Demo database, and then expand Tables.
17. Right-click Landing.IncomingAgentsSourceA, and then click Select Top 1000 rows. Confirm that
results appear in the query window.
18. Right-click Landing.IncomingAgentsSourceB, and then click Select Top 1000 rows. Confirm that
results appear in the query window.
5. On the Specify Options for Partitions and Roles page, accept the default options, and then click
Next.
6. On the Specify Configuration Properties page, click Next.
8. On the Confirm Deployment page, select the Create deployment script check box, and then click
Next (overwrite the existing deployment script if prompted).
9. On the Deploying database page, wait for the deployment script to be completed, and then click
Next.
10. On the Deployment Complete page, click Finish.
12. To connect to the MIA-SQL Analysis Services instance, in the Connect to Server dialog box, ensure
that the Server type is Analysis Services, and Authentication is Windows Authentication. In the
Server name list, select MIA-SQL, and then click Connect.
13. On the File menu, point to Open, and then click File.
14. Browse to the folder D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSAS\ and double-
click AW_SSAS Script.xmla.
2. In the Run dialog box, type dcomcnfg, and then press Enter.
3. In the Component Services dialog box, in the left pane, expand Component Services, expand
Computers, expand My Computer, and then expand DCOM Config.
4. Right-click Microsoft SQL Server Integration Services 14.0, and then click Properties.
5. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, on the Security tab,
in the Launch and Activation Permissions section, click Edit.
7. In the Select Users, Computers, Service Accounts, or Groups dialog box, in the Enter the object
names to select box, type Student, click Check Names, and then click OK.
8. In the Launch and Activation Permissions dialog box, ensure that Local Launch, Remote Launch,
Local Activation, and Remote Activation are all selected for Allow, and then click OK.
9. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, in the Access
Permissions section, click Edit.
13. In the Microsoft SQL Server Integration Services 14.0 Properties dialog box, click OK.
15. On the Start page, type SQL Server 2017 Configuration Manager, and then click SQL Server 2017
Configuration Manager.
18. In the right pane, right-click SQL Server Integration Services 14.0, and then click Restart.
19. Wait for SQL Server Integration Services to restart, and then close SQL Server Configuration Manager.
dtutil /FILE
D:\Labfiles\Lab04\Starter\agent\_work\1\s\AW_BI\AW_SSIS\EIM_Demo_DW_Load.dtsx
/DestServer MIA-SQL /COPY SQL;EIM_Demo_DW_Load
8. A command prompt window will open; at the Are you sure you want to overwrite it? prompt, type
Y, and then press Enter. The command prompt will close when the command has completed.
10. In the Connect to Server dialog box, ensure that the Server type is Integration Services,
Authentication is Windows Authentication, and then in the Server name list, select MIA-SQL,
then click Connect.
11. In Object Explorer, expand Stored Packages, and then expand MSDB.
4. In Solution Explorer, right-click Solution AW_BI (3 projects), and then click Deploy Solution.
5. If the Microsoft Visual Studio dialog box appears, click Yes.
6. In the Deployment Progress - AW_SSAS dialog box, when the deployment has completed, click
Close.
Manually deployed a DACPAC that has been part of a Team Foundation Server Build.
3. In the Log Properties – Application (Type: Administrative) dialog box, next to Maximum Log Size
(KB), type 50000.
4. Select the check box next to Archive the log when full, do not overwrite events, and then click Apply.
5. In the Event Viewer dialog box, read the message and then click OK.
3. In Object Explorer, expand the Management node, right-click SQL Server Logs, and then click
Configure.
4. In the Configure SQL Server error logs dialog box, on the General page, select the check box next
to Limit the number of error logs before they are recycled.
5. Next to Maximum number of error log files, type 14, and then click OK.
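Note: The same setting can be scripted. SSMS stores the error log count in the instance registry hive, so
the following sketch (which assumes you have permission to write to the registry on MIA-SQL) has the same
effect as the dialog box:
USE master;
GO
-- Keep 14 error logs before they are recycled (equivalent to the Configure dialog box).
EXEC xp_instance_regwrite N'HKEY_LOCAL_MACHINE',
    N'Software\Microsoft\MSSQLServer\MSSQLServer',
    N'NumErrorLogs', REG_DWORD, 14;
GO
-- Optionally start a new error log immediately.
EXEC sp_cycle_errorlog;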
2. To view the list of data collector sets, in the Performance Monitor window, on the left pane, click
Data Collector Sets.
3. To create a new data collector set, expand the Data Collector Sets node, right-click User Defined,
point to New, and then click Data Collector Set.
4. In the Create New Data Collector Set wizard, on the How would you like to create this new data
collector set? page, in the Name box, type SQL BI Monitoring.
5. Select the Create manually (Advanced) option and then click Next.
6. On the What type of data do you want to include? page, select the Performance counter check
box, and then click Next.
7. On the Which performance counters would you like to log? page, click Add.
8. In the dialog box, in the Available counters section, expand the Processor node, scroll down, click
%Processor Time, and then click Add.
9. Repeat step 8 to add the following counters:
Memory: Pages/sec
12. On the Where would you like the data to be saved? page, click Next.
13. On the Create the data collector set? page, ensure that Save and close is selected, and then click
Finish.
14. In Performance Monitor, right-click SQL BI Monitoring, and then click Start. Verify that the
collector starts running.
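Note: The data collector set captures operating system counters; SQL Server also exposes its own counters
through a dynamic management view. While the collector is running, you could cross-check a few values
with a query such as the following (the counter names shown are only examples):
-- Spot-check SQL Server counters alongside the Performance Monitor collection.
SELECT object_name, counter_name, instance_name, cntr_value
FROM sys.dm_os_performance_counters
WHERE counter_name IN (N'Page life expectancy', N'Batch Requests/sec');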
2. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.
3. In Object Explorer, expand the Integration Services Catalogs node, right-click SSISDB, and click
Customized Logging Level.
5. In the Create Customized Logging Level dialog box, under Name, type Errors and Warnings, then
click OK.
6. In the Customized Logging Level Management dialog box, with Errors and Warnings highlighted,
under Configuration, click the Statistics tab.
8. Click the Events tab, select the check box next to OnWarning, and select the check box next to
OnError.
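Note: After a package execution runs with the Errors and Warnings logging level, the captured events can
be queried from the SSISDB catalog. A minimal sketch, assuming at least one execution has used this
logging level:
USE SSISDB;
GO
-- Show OnError and OnWarning events captured by the SSIS catalog.
SELECT operation_id, message_time, event_name, package_name, message
FROM catalog.event_messages
WHERE event_name IN (N'OnError', N'OnWarning')
ORDER BY message_time DESC;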
3. In the New Job dialog box, in the Name box, type EIM_Demo BI Load.
4. Under select a page, click Steps, and then click New.
6. In the Type drop-down list, select SQL Server Integration Services Package.
7. On the Package tab, in the Package source drop-down list, select SQL Server.
13. Verify that the job starts successfully (don't wait for it to complete; it will take a long time), and then
click Close.
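Note: The job you just created with the New Job dialog box can also be scripted. The following T-SQL is a
sketch of an equivalent definition (do not run it in addition to the steps above, because the job already
exists); the job step command uses dtexec-style arguments against the MSDB package store:
USE msdb;
GO
-- Scripted equivalent of the EIM_Demo BI Load job created in the steps above.
EXEC sp_add_job @job_name = N'EIM_Demo BI Load';
EXEC sp_add_jobstep
    @job_name = N'EIM_Demo BI Load',
    @step_name = N'Run EIM_Demo_DW_Load',
    @subsystem = N'SSIS',
    @command = N'/SQL "EIM_Demo_DW_Load" /SERVER MIA-SQL';
EXEC sp_add_jobserver @job_name = N'EIM_Demo BI Load';
EXEC sp_start_job @job_name = N'EIM_Demo BI Load';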
A data collector.
2. On the menu bar, click File, point to Templates, and then click New Template.
3. In the Trace Template Properties dialog box, next to Select server type, click the drop-down, and
select Microsoft SQL Server “2017”.
Category Event
6. Click Save.
2. In the Trace Template Properties dialog box, next to Select server type, click the drop-down, and
select Microsoft SQL Server “2017” Analysis Services.
4. Click the Events Selection tab, and configure the following events, leaving the column options as
default:
Category Event
5. Click Save.
8. In the Connect to Server dialog box, set the Server type to Analysis Services, the Server name to
MIA-SQL, the Authentication to Windows Authentication, and then click Connect.
9. In Object Explorer, expand the Databases folder. If the AW_SSAS database exists, complete the
following steps to delete it:
11. In the Restore Database dialog box, in the Backup file box, type D:\Setupfiles\AW_SSAS.abf, and
then click OK.
13. Expand Databases, and verify that the AW_SSAS database appears.
3. List the tools you will use to identify the issue ready for a discussion at the end of the lab.
3. Determine the cause of the issue, backed with evidence from the monitoring, and determine how to
fix the issue.
2. In the Connect to Server dialog box, in the Server type list, click Database Engine, in the Server
name list, ensure that MIA-SQL is selected, and then click Connect.
3. In Object Explorer, expand SQL Server Agent and then double-click Job Activity Monitor.
4. In the Job Activity Monitor - MIA-SQL dialog box, under Agent Job Activity, right-click
EIM_Demo BI Load, and then click Stop Job.
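Note: The job can also be stopped from a query window, which is useful if Job Activity Monitor is slow to
respond while the package is executing:
-- Stop the running BI load job.
EXEC msdb.dbo.sp_stop_job @job_name = N'EIM_Demo BI Load';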
9. In the Connect to Server dialog box, in the Server type list, click Integration Services, ensure the
Server name box is set to MIA-SQL, then click Connect.
10. In Object Explorer, expand Stored Packages, expand MSDB, right-click the EIM_Demo_DW_Load
package, and then click Export Package.
11. In the Export Package dialog box, click the ellipses next to Package path.
12. In the Save Package To File dialog box, navigate to D:\Labfiles\Lab06\Starter, then click Save.
16. On the File menu, point to New, and then click Project.
17. In the New Project dialog box, click Integration Services Project.
18. Clear the Create directory for solution and Add to Source Control check boxes.
20. In the Location box, type D:\Labfiles\Lab06\Starter\, and then click OK.
21. In Solution Explorer, right-click the SSIS Packages folder, and then click Add Existing Package.
22. In the Add Copy of Existing Package dialog box, click the ellipses next to Package path.
23. In the Load Package dialog box, navigate to D:\Labfiles\Lab06\Starter, and then double-click
EIM_Demo_DW_load.dtsx.
24. In the Add Copy of Existing Package dialog box, click OK.
25. In Solution Explorer, right-click the EIM_Demo_DW_Load.dtsx package, and then click Open.
26. Scroll down until the Truncate Tables step is visible, right-click this step, and then click Edit.
27. In the Execute SQL Task Editor dialog box, in the SQL Statement field, to the right of the field, click
the ellipses.
28. In the Enter SQL Query dialog box, delete all the text below the line --Test code, and then click OK.
29. In the Execute SQL Task Editor dialog box, click OK.
30. On the File menu, click Save All and then close Visual Studio.
33. In the Connect to Server dialog box, in the Server type list, click Integration Services, ensure the
Server name box is set to MIA-SQL, then click Connect.
34. In Object Explorer, expand Stored Packages, expand MSDB, right-click the EIM_Demo_DW_Load
package, and then click Import Package.
35. In the Import Package dialog box, in the Package path box, type
D:\Labfiles\Lab06\Starter\Exercise 1 Solution\EIM_Demo_DW_Load.dtsx.
Note: Make sure that you select the package in the Exercise 1 Solution folder, and not the
Starter folder.
36. In the Package name box, verify that the text is EIM_Demo_DW_Load, and then click OK.
37. In the Import Package dialog box, click Yes to overwrite the existing package.
38. In Object Explorer, click Connect, and then click Database Engine.
39. In the Connect to Server dialog box, in the Server name list, ensure that MIA-SQL is selected, and
then click Connect.
41. In the Open File dialog box, in the File name box, type
D:\Labfiles\Lab06\Starter\BI_LoadReset.sql, and then click Open.
45. In Microsoft SQL Server Management Studio, in Object Explorer, under MIA-SQL, expand SQL Server
Agent, and then double-click Job Activity Monitor.
46. In the Job Activity Monitor - MIA-SQL dialog box, click Refresh until the EIM_Demo BI Load
status changes from Executing to Idle. This can take several minutes. Note that the job completes
with a data error, which you will resolve in the next exercise.
47. In the Job Activity Monitor - MIA-SQL dialog box, click Close.
Used the appropriate logging and monitoring tools to identify the issue.
3. Determine the cause of the issue, backed with evidence from the monitoring, and determine how to
fix the issue.
3. In the Connect to Server dialog box, in the Server type list, click Analysis Services.
4. In the Server name list, ensure that MIA-SQL is selected, and then click Connect.
5. In Object Explorer, expand Databases, expand AW_SSAS, expand Dimensions, right-click the
Agents dimension, and then click Process.
8. In Object Explorer, right-click the Customers dimension, and then click Process.
15. In the Process Dimension - Policy Event dialog box, click OK.
17. In Object Explorer under AW_SSAS, expand Cubes, right-click EIM Demo, and then click Process.
18. In the Process Cube - EIM demo dialog box, click OK.
19. In the Process Progress dialog box, note that the cube fails to process due to a data issue.
20. In the Process Progress dialog box, expand Command, and then expand each node.
21. Click the bottom-most error message, and then click View Details.
22. In the View Details dialog box, note that an attribute key -1 cannot be found for the AgentCode,
and then click Close.
2. In the Connect to Server dialog box, in the Server name list, select MIA-SQL, and then click
Connect.
3. Expand Databases, expand EIM_Demo, expand Tables, right-click EDW.DimAgents, and then click
Edit Top 200 Rows.
4. In the MIA-SQL.EIM_Demo - EDW.DimAgents window, click the NULL row at the end of the list.
5. In the BrokerID column, type -1, in the Broker Name column, type Not Found, and then press
Enter.
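Note: Editing the row in the grid issues an INSERT behind the scenes. An equivalent T-SQL sketch is shown
below; the column names are assumed from the grid labels in the previous step, so adjust them (and add any
other required columns) to match the actual table definition. The same pattern fixes the CustomerCode
error later in this exercise, with a matching -1 row in EDW.DimCustomers.
USE EIM_Demo;
GO
-- Add the unknown member row that the Agents dimension expects (sketch; column names assumed).
INSERT INTO EDW.DimAgents ([BrokerID], [Broker Name])
VALUES (-1, N'Not Found');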
6. In Object Explorer, in the AW_SSAS database, under Dimensions, right-click the Agents dimension,
and then click Process.
10. In the Process Cube - EIM Demo dialog box, click OK. Note that another error occurs.
11. In the Process Progress dialog box, expand Command, and then expand each node.
12. Click the bottom-most error message, and then click View Details.
13. In the View Details dialog box, note that an attribute key -1 cannot be found for the
CustomerCode, and then click Close.
16. In Object Explorer, right-click EDW.DimCustomers, and then click Edit Top 200 Rows.
19. In Object Explorer, right-click the Customers dimension, and then click Process.
22. Right-click the EIM Demo cube, and then click Process.
23. In the Process Cube - EIM Demo dialog box, click OK. Note that the cube is now processed
successfully.
Used the appropriate logging and monitoring tools to identify the issue.
4. In the User Account Control dialog box, click Yes, and then wait for the script to finish.
2. In the Connect to Server dialog box, ensure that the Server type is Database Engine, the Server
name is MIA-SQL, and Authentication is Windows Authentication, and then click Connect.
3. On the File menu, point to Open, and then click File.
4. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query1.sql.
5. On the Query menu, click Include Actual Execution Plan.
6. Click Execute.
7. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.
12. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.
13. On the File menu, point to Open, and then click File.
14. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query3.sql.
17. Review the execution plan, making a note of what part of the query execution could be improved for
optimal execution of the query.
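Note: You can also request the actual plan from the query text itself. SET STATISTICS XML ON returns the
actual execution plan as an XML result alongside the query results, which is a convenient way to capture
the plans for all three queries for the discussion that follows:
-- Return the actual execution plan as XML with the query results.
SET STATISTICS XML ON;
GO
-- Run the contents of Query1.sql, Query2.sql, or Query3.sql here.
GO
SET STATISTICS XML OFF;
GO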
2. Discuss the findings you can identify from the execution plans.
4. Based on the available information, identify the main issue with each query and determine whether to
refactor or use indexes to fix the query.
4. In the Configure Overall Resource Consumption window, in the Time Interval section, click Last
hour, and then click OK.
5. Note that the report contains four charts that show:
a. Duration
b. Execution Count
c. CPU Time
d. Logical Reads
6. In the Duration report, position the cursor on the highest bar in the bar chart.
7. Note that a tooltip appears that provides the query execution information, including:
a. Interval Start
b. Interval End
d. Duration (ms)
e. Logical Writes
f. Logical Reads
h. Physical Reads
i. Execution Count
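Note: The metrics shown in these reports and tooltips come from the Query Store catalog views, so they
can also be retrieved directly with T-SQL. A minimal sketch that lists the top queries by total duration
over the last hour (run it in the database where Query Store is enabled):
-- Top queries by total duration over the last hour, from the Query Store views.
SELECT TOP (10)
    q.query_id,
    SUM(rs.count_executions) AS executions,
    SUM(rs.avg_duration * rs.count_executions) AS total_duration_microseconds,
    SUM(rs.avg_logical_io_reads * rs.count_executions) AS total_logical_reads
FROM sys.query_store_query AS q
JOIN sys.query_store_plan AS p ON p.query_id = q.query_id
JOIN sys.query_store_runtime_stats AS rs ON rs.plan_id = p.plan_id
JOIN sys.query_store_runtime_stats_interval AS i
    ON i.runtime_stats_interval_id = rs.runtime_stats_interval_id
WHERE i.start_time >= DATEADD(HOUR, -1, SYSUTCDATETIME())
GROUP BY q.query_id
ORDER BY total_duration_microseconds DESC;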
2. In the Top Resource Consuming Queries window, notice that there is an execution plan in the
bottom part of the report showing the plan for the longest-running query.
3. At the top left, notice that a bar chart is displayed showing the queries with the longest duration.
4. Find the query with the highest duration in the bar chart, and click it.
5. Note the query number from the plan summary on the top right for use in the next task.
6. At the top right of the Top Resource Consuming Queries window, observe the chart for Plan
Summary. Note that the query is not using a forced plan.
7. Under the chart for the plan summary, click the Force Plan button.
2. At the top left, in the Tracked Queries box, type the query number from the previous task, and then
click the play button.
3. A chart appears showing the execution history of the query. You can also force and unforce a plan
for the query from this view.
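Note: Forcing and unforcing a plan is also available as stored procedures, using the query ID and plan ID
shown in the report. A sketch with placeholder IDs that you would replace with the values noted earlier
(run it in the database where Query Store is enabled):
-- Force, and later unforce, a specific plan for a query (replace the placeholder IDs).
EXEC sp_query_store_force_plan @query_id = 1, @plan_id = 1;
EXEC sp_query_store_unforce_plan @query_id = 1, @plan_id = 1;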
5. Click Execute.
6. Review the execution plan, making a note of improvements to the query execution plan.
2. In the Open File dialog box, browse to the D:\Labfiles\Lab07\Starter folder, and then double-click
Query2.sql.
5. Click Execute.
6. Highlight the SELECT statement at the top of the query window, and then click Execute.
7. Review the query execution plan, making a note of the improvements made.
8. Close SQL Server Management Studio.