Microsoft Azure SQL Database Step by Step
Leonard G. Lobel
Eric D. Boyd
PUBLISHED BY
Microsoft Press
A Division of Microsoft Corporation
One Microsoft Way
Redmond, Washington 98052-6399
Copyright © 2014 by Leonard G. Lobel and Eric D. Boyd
All rights reserved. No part of the contents of this book may be reproduced or transmitted in any form or by any
means without the written permission of the publisher.
Library of Congress Control Number: 2014940679
ISBN: 978-0-7356-7942-9
First Printing
Microsoft Press books are available through booksellers and distributors worldwide. If you need support related
to this book, email Microsoft Press Book Support at mspinput@microsoft.com. Please tell us what you think of
this book at http://aka.ms/tellpress.
The example companies, organizations, products, domain names, email addresses, logos, people, places, and
events depicted herein are fictitious. No association with any real company, organization, product, domain name,
email address, logo, person, place, or event is intended or should be inferred.
This book expresses the authors’ views and opinions. The information contained in this book is provided without
any express, statutory, or implied warranties. Neither the authors, Microsoft Corporation, nor its resellers or
distributors will be held liable for any damages caused or alleged to be caused either directly or indirectly by
this book.
To my partner of 20 years, Mark, and our children, Adam, Jacqueline, Joshua,
and Sonny. With all my love, I thank you guys, for all of yours.
—Leonard Lobel
For my loving wife, Shelly, and our wonderful boys, Jaxon and Xander.
—Eric Boyd
Contents at a glance
Introduction xiii
Index 357
Contents
Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . xiii
Changing the database edition and maximum size. . . . . . . . . . . . . .44
Deleting a database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .44
Using PowerShell. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .44
Installing the Microsoft Azure PowerShell cmdlets . . . . . . . . . . . . . .44
Using the PowerShell Integrated Scripting Environment . . . . . . . . .46
Configuring PowerShell for your Microsoft account . . . . . . . . . . . . .46
Creating a new server . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 47
Creating a new database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .48
Deleting a database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .50
Budgeting for SQL Database. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .50
SQL storage . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .50
Client bandwidth . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Backup storage space . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51
Backup storage bandwidth. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .52
Support . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .53
Optimizing your costs . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .54
Configuring the database edition and size . . . . . . . . . . . . . . . . . . . . . 55
Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .56
SQL Server Bulk Copy (bcp) . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .80
Migrating Schema . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 81
Exporting data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .83
Importing data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .84
SQL Database Migration Wizard . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .86
Downloading the tool . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .87
Migrating a database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .88
Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .95
Using Visual Studio Report Server projects . . . . . . . . . . . . . . . . . . . . . . . . .150
Installing AdventureWorks2012 for SQL Database . . . . . . . . . . . . .152
Installing SSDT Business Intelligence for Visual Studio 2012 . . . . .154
Creating a report using Visual Studio. . . . . . . . . . . . . . . . . . . . . . . . .156
Implementing report security . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .170
Shutting down the SSRS virtual machine . . . . . . . . . . . . . . . . . . . . . . . . . . .171
Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .171
Adding an ADO.NET Web API controller . . . . . . . . . . . . . . . . . . . . . .230
Testing the Customer Web API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .233
Managing SQL Database connections . . . . . . . . . . . . . . . . . . . . . . . . . . . . .234
Opening late, closing early . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .234
Pooling connections . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .234
Recovering from connection faults . . . . . . . . . . . . . . . . . . . . . . . . . . .234
Adding the Transient Fault Handling Application Block . . . . . . . . .235
Using the Transient Fault Handling Application Block
with ADO.NET . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .237
Using the Transient Fault Handling Application Block
with Entity Framework. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .239
Reducing network latency . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .243
Keeping services close . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .243
Minimizing round trips . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .243
Effectively using SQL Database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .244
Using the best storage service . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .244
Optimizing queries . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .245
Scaling up SQL Database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .245
Partitioning data . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .250
Scaling out with functional partitions . . . . . . . . . . . . . . . . . . . . . . . .250
Scaling out with shards . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .251
Summary. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .260
Chapter 10 Building cloud solutions 289
Creating the SQL Database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .292
Extending the SQL Database . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .294
Creating a new solution . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .294
Creating a SQL Server Database project . . . . . . . . . . . . . . . . . . . . . .295
Setting the target platform . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .296
Importing from SQL Database into the project . . . . . . . . . . . . . . . .297
Adding a new column to the Wine table . . . . . . . . . . . . . . . . . . . . . .300
Deploying the project to Microsoft Azure SQL Database . . . . . . .301
Creating the Order table . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .305
Creating stored procedures for the Order table . . . . . . . . . . . . . . . .307
Creating the data access layer . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .312
Introducing the Entity Data Model . . . . . . . . . . . . . . . . . . . . . . . . . . .313
Creating the Data Access Layer project . . . . . . . . . . . . . . . . . . . . . . .314
Creating an Entity Data Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .315
Creating the website. . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .321
Creating an ASP.NET web application project . . . . . . . . . . . . . . . . .321
Referencing the data access layer . . . . . . . . . . . . . . . . . . . . . . . . . . . .323
Creating the user interface . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .324
Testing the website locally . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .328
Deploying the website to Microsoft Azure . . . . . . . . . . . . . . . . . . . .331
Creating the ASP.NET Web API services . . . . . . . . . . . . . . . . . . . . . . . . . . . .336
Adding a Web API controller . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .337
Testing the Web API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .339
Deploying the Web API . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .340
Creating the Windows Phone application . . . . . . . . . . . . . . . . . . . . . . . . . .341
Installing the Windows Phone SDK 8.0 . . . . . . . . . . . . . . . . . . . . . . .341
Creating the Windows Phone Project . . . . . . . . . . . . . . . . . . . . . . . .343
Adding Json.NET . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .343
Creating the App’s main page . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . .344
Testing the Windows Phone application . . . . . . . . . . . . . . . . . . . . . .353
Index 357
Introduction
Microsoft Azure SQL Database is the cloud version of Microsoft SQL Server, which
is Microsoft’s well-established on-premises relational database engine platform.
Despite some noteworthy differences, SQL Database (the short name for Microsoft
Azure SQL Database) is largely compatible with SQL Server, so for the most part, any
experience you have working with SQL Server can be directly and immediately applied
to SQL Database. If you are a software professional looking to consider the cloud as a
platform for the database in your next application, SQL Database can be just the right
tool for you. And if you want to get up to speed quickly with this emerging platform,
with or without SQL Server experience, this is just the right book for you.
One big difference between on-premises software and cloud services is that the
latter can be updated and enhanced much more frequently than the former, given that
no installation or customer infrastructure is required in the cloud case. Cloud services
are subject to frequent changes in pricing as well. As such, features, limitations, costs,
the tooling user interface, or even the branding of Microsoft Azure SQL Database, as
described in this book, may have evolved by the time you read it. For example, shortly
before going to press, the platform formerly branded as Windows Azure was changed
to Microsoft Azure. (Although the book title and textual references were updated
accordingly, many screen shots still show the older name, Windows Azure.) Regardless
of the potential for such changes, the principles and techniques covered throughout
this book will help you achieve comfort with and mastery of Microsoft Azure SQL
Database.
Note As Azure evolves, we evolve with it. Even as this first edition goes to
press, we are busy planning the next edition with expanded coverage of the
recently announced Basic, Standard, and Premium editions. These new ser-
vice tiers (which have limited preview availability at the time of this writing)
can support larger and more scalable databases than the current Web and
Business editions offer. Our next edition will also be revised for the upcoming
release of a new management portal currently being developed by Microsoft.
The book is also useful for those who are familiar with on-premises SQL Server and are
interested in creating new applications to work with SQL Database, or those who would
like to migrate existing applications that currently work with on-premises SQL Server to
work with SQL Database as well.
Assumptions
No prior knowledge or experience with Microsoft Azure and cloud computing is
assumed or required. Furthermore, although experience with Microsoft SQL Server is
certainly useful, that too is not required.
Several chapters involve .NET programming. Here, too, prior experience with
Microsoft Visual Studio and C# is helpful but not required. The procedures in these
chapters include complete code listings, and clear explanations of the code are
provided.
This book might not be for you if…
This book might not be for you if you already have extensive knowledge and
experience with SQL Database, and are seeking to delve deeper into internals or other
specialized focus areas not covered in this book. Still, this book contains useful infor-
mation even for experienced users. Therefore, we recommend that you take a quick
glance at the chapter descriptions in the next section. Doing so should help you quickly
determine if there are specific areas of interest we cover that you would like to learn
more about.
■ Chapter 2—Configuration and pricing With the basics covered, this chapter
explains additional options for configuring SQL Database, beyond the browser-
based portals introduced in Chapter 1. You will learn how to connect to SQL
Database using familiar local tools, such as SQL Server Management Studio
(SSMS) and SQL Server Data Tools (SSDT) inside Visual Studio. You will also learn
how to configure and manage SQL Database using PowerShell, by downloading
the Microsoft Azure PowerShell cmdlets. The chapter concludes with an expla-
nation of how SQL Database pricing is structured on Microsoft Azure, and it
provides tips to help you budget for a SQL Database solution.
■ Chapter 3—Differences between SQL Server and Microsoft Azure SQL
Database Readers with prior SQL Server experience will want to know about
the important differences between the on-premises relational engine they are
familiar with and the SQL Database implementation on Microsoft Azure. This
brief chapter enumerates these differences and explains the rationale behind
them. Where possible, we suggest workarounds for SQL Server features that are
not supported in SQL Database.
■ Chapter 6—Cloud reporting When you have data in a database, it’s only a
matter of time before you also have reporting requirements related to that data.
And when that database is hosted in the cloud on Microsoft Azure, it’s only
natural to consider using the Azure cloud to host a reporting solution as well. In
this chapter, you will learn how to create an Azure virtual machine (VM) to host
SQL Server Reporting Services (SSRS) in the cloud. (No prior SSRS experience is
needed.) Once the VM is configured, you will learn how to build SSRS reports
using two report authoring tools: Report Builder and SSDT Business Intelligence
for Visual Studio. After building and previewing reports locally, you will learn
how to deploy them to the VM for a complete reporting solution in the cloud.
■ Chapter 7—Microsoft Azure SQL Data Sync In this chapter, you will learn
how to use the SQL Data Sync service available on Microsoft Azure to replicate
data between multiple databases. You will learn about the hub-and-spoke
architecture upon which the service is based, and see how SQL Data Sync can be
used to implement solutions for a variety of scenarios, including one-way or bi-
directional replication across a set of databases in multiple locations. The proce-
dures in this chapter walk you through the process of configuring the SQL Data
Sync service and creating sync groups that replicate between multiple databases
hosted both in the cloud (on Microsoft Azure SQL Database) and on-premises
(using SQL Server). You will also learn how to establish a conflict-resolution strat-
egy and set up an automated synchronization schedule.
■ Chapter 10—Building cloud solutions In the book’s closing chapter, you will
learn how to build a complete solution in the cloud on top of Microsoft Azure
SQL Database. Specifically, you will create a Visual Studio solution that includes
a SQL Server Database project, an Entity Framework data-access layer, ASP.NET
MVC, and ASP.NET Web API. The solution provides a website, web services, and
a Windows Phone 8 app with functionality for users to retrieve and update data
stored in SQL Database.
■ Text that you type (apart from code blocks) appears in bold.
■ A plus sign (+) between two key names means that you must press those keys at
the same time. For example, “Press Alt+Tab” means that you hold down the Alt
key while you press the Tab key.
■ A vertical bar between two or more menu items (for example, File | Close) means
that you should select the first menu or menu item, then the next, and so on.
System requirements
At a minimum, there are no special system requirements for working with SQL
Database. The Microsoft Azure management portal requires only a web browser and
Internet access. Similarly, the SQL Database management portal requires only a browser
with the Silverlight plug-in.
Some chapters walk you through procedures that use local tools—typically, SQL
Server Management Studio (SSMS) and Visual Studio 2013. To complete these proce-
dures, you will need to have those tools installed as well, which requires the following:
■ Visual Studio 2013, any edition. (Multiple downloads may be required if using
Express Edition products.)
■ SQL Server 2012 Express Edition or higher, with SQL Server Management Studio
2012 Express or higher. (Included with Visual Studio, Express Editions require
separate download.)
Chapter 4, "Migrating databases," and Chapter 7, "Microsoft Azure SQL Data Sync,"
include procedures that require a local SQL Server instance on which you have permis-
sions to create a database. If you don’t have access to a local SQL Server instance, you
can install SQL Server Express Edition (the free version of SQL Server) by following the
instructions shown in the next section.
Finally, several individual chapters work with additional software that gets installed
locally. These chapters include detailed procedures for downloading and installing the
necessary software so that you can follow along with the rest of the chapter.
Downloads: SQL Server Express Edition
There are several SQL Server Express Edition downloads available on the Microsoft site,
and they are available in both 32-bit and 64-bit versions. You can choose to install just
the SQL Server Express database engine (and nothing else), or you can choose one of
two other (larger) downloads: Express With Tools (which includes SQL Server Manage-
ment Studio [SSMS]) or Express With Advanced Services (which includes SSMS, Full
Text Search, and Reporting Services). There are also separate downloads for SSMS and
LocalDB, but these do not include the SQL Server Express database engine needed to
host local databases.
To install the SQL Server Express Edition database engine, follow these steps:
3. Select the appropriate download for your system, as shown in Figure I-1:
FIGURE I-1 Downloading SQL Server 2012 Express Edition (64-bit version)
4. Click Next.
5. If you receive a pop-up warning, click Allow Once, as shown in Figure I-2.
6. When prompted to run or save the file, choose Run. This starts and runs the
download.
7. If the User Account Control dialog appears after the download files are
extracted, click Yes.
8. In the SQL Server Installation Center, click New SQL Server Stand-Alone
Installation, as shown in Figure I-3.
FIGURE I-3 Choosing a new SQL Server installation
a. On the License Terms page, select I Accept The License Terms and click
Next.
b. On the Product Updates page, allow the wizard to scan for updates, and
then click Next.
c. On the Install Setup Files page, wait for the installation to proceed.
e. Continue clicking Next through all the remaining pages until the
Installation Progress page, and wait for the installation to proceed.
FIGURE I-4 SQL Server Express installation in progress
http://aka.ms/AzureSQLDB_SBS
found in each of the chapters. Finally, the code folders for Chapters 6, 8, and 10 also
include the completed Visual Studio solutions for the exercises found in those chapters.
Acknowledgments
I was first asked to write a book on SQL Azure—back when it was still called SQL
Azure—nearly two years ago. It’s been a long road since then, and despite seismic
shifts both in the Azure product platform and in the book publishing ecosystem (not to
mention an unexpected curve ball or two), I am extremely delighted to finally publish!
This is my third technical book, and although each experience has been unique, I’ve
learned the same lesson in each case: I could not have even contemplated the challenge
without the aid of numerous other talented and caring individuals. These are folks who
deserve special recognition—people who lent their generous support in so many
different ways that it’s impossible to mention names in any prescribed order.
So I’ll start with Andrew Brust. If not for Andrew (who himself is a well-established
leader in the software industry), I would never have started down the book-writing
path in the first place. I am grateful for our personal friendship, as well as our working
relationship writing books and presenting workshops together. These experiences truly
help me thrive and grow.
I’m also fortunate to have teamed up with my colleague and co-author Eric Boyd,
who produced four excellent chapters on several advanced topics. Eric is an extremely
talented software professional, whose expertise and passion for technology comes
through clearly in his writing.
Russell Jones, my pal at O’Reilly Media, gets special mention of course, because he’s
the one who asked me to write this book in the first place. I thank Russell, not only for
offering me the opportunity, but for his expert guidance and assistance during the
transition to Microsoft Press. More thanks go out to Roger LeBlanc for his copyediting
review, and to Scott Klein for his technical review. Special thanks as well to Devon
Musgrave and Rosemary Caperton at Microsoft Press, and Steve Sagman of Waypoint
Press. Their guidance has been vital to the successful production of this book, and it has
been an absolute pleasure working with each one of them.
I would like to give special mention to the Microsoft MVP program, which was an
indispensable resource during the writing of this book. So thank you Microsoft, and to
my MVP lead Simon Tien as well, for his constant encouragement.
This book could not have been written, of course, without the love and support of
my family. I owe an enormous debt of gratitude to my wonderful partner Mark, and our
awesome kids Adam, Jacqueline, Josh, and Sonny, for being so patient and tolerant with
me throughout this project.
And greatest thanks of all go out to my dear Mom, bless her soul, for always
encouraging me to write with “expression.”
—Leonard Lobel
I have been developing software professionally for almost 20 years and I am grateful for
being blessed with deep interest and excitement for this industry, the ability to learn and
understand what are sometimes very complex concepts, and the support of my family,
friends, mentors and peers throughout my career. Writing a book like this requires lots
of guidance and help from many people, and I have many people to thank.
First and foremost, I want to thank God for everything: for life, salvation, family,
friends, talents, abilities and everything.
Working on this project over the past year has been a lot of fun, but it has also been
a lot of work. My family has been extremely supportive, even when I had to block off
nights and weekends to write. I owe so much to my wife, Shelly, for everything that she
does for our family. And I’m so thankful for our two wonderful boys who enjoy sitting
next to me in my office and cuddling up next to me with my laptop in the living room,
when my evenings and weekends get occupied with writing.
In addition to family, I want to thank friends and co-workers who have also been
very supportive during this project, even when I bring my laptop to their living rooms,
kitchens and dining rooms so that I can write a few more words, paragraphs and pages.
I want to thank Lenni Lobel who invited me to join him on this project. Lenni has
been a fantastic co-author and has done a great job leading this project and driving it
to completion. His guidance, editing, and feedback have been extremely valuable for me
personally and for the project. I’m also very appreciative of his patience throughout this
project.
Last, but certainly not least, thank you to everyone at Microsoft and Microsoft Press
who have helped with this project both directly and indirectly; this list includes Scott
Klein, Dora Chan, Mark Brown, Devon Musgrave, Rosemary Caperton, Steve Sagman,
Conor Cunningham, the Azure CAT team, and so many more.
—Eric Boyd
Errata, updates, & book support
We’ve made every effort to ensure the accuracy of this book and its companion
content. You can access updates to this book—in the form of a list of submitted errata
and their related corrections—at:
http://aka.ms/AzureSQLDB_SBS
If you discover an error that is not already listed, please submit it to us at the same
page.
Please note that product support for Microsoft software and hardware is not
offered through the previous addresses. For help with Microsoft software or hardware,
go to http://support.microsoft.com.
http://aka.ms/tellpress
We know you’re busy, so we’ve kept it short with just a few questions. Your answers
go directly to the editors at Microsoft Press. (No personal information will be request-
ed.) Thanks in advance for your input!
Stay in touch
Let’s keep the conversation going! We’re on Twitter: http://twitter.com/MicrosoftPress
CHAPTER 1
Getting started with Microsoft Azure SQL Database
In this chapter, you will create your first database on the Microsoft Azure SQL Database platform—
completely from scratch. From scratch means that all you need to follow along is a web browser (the
chapter uses Internet Explorer) and Internet access. You will sign up for a Microsoft account (if you
don’t already have one), and use your Microsoft account to create and access a free trial subscription
to Microsoft Azure. Then we’ll introduce you to the Microsoft Azure management portal, and you’ll
quickly get a server and database up and running in the cloud. Finally, you’ll use the SQL Database
management portal to design, populate, and query the database.
Note We’ll often refer to Microsoft Azure SQL Database simply as SQL Database. The
term SQL Server refers exclusively to on-premises database instances, while the term SQL
Database always means the cloud-based Microsoft Azure SQL Database.
Although this book is focused on Microsoft Azure SQL Database, you’ll find it helpful to
understand SQL Database in the broader context of the Microsoft Azure platform, and cloud
computing in general. This understanding will greatly enhance your appreciation of SQL Database.
So, before signing up for a Microsoft account, here’s a brief high-level discussion of cloud computing
with Microsoft Azure.
Internet as a way of connecting clients to various infrastructure, platform, and application services
with a far greater degree of flexibility and abstraction than previous hosting schemes could possibly
offer.
One of the earliest cloud-computing platforms was Amazon Web Services (AWS), introduced back
in 2002 by Amazon.com. Still today, AWS is prominently positioned as a serious contender in the
cloud service industry. Since the mid-2000s, cloud computing has been rapidly gaining popularity,
and in 2009, Microsoft unveiled Microsoft Azure (which was called Windows Azure until the name
was changed in April 2014). Even as Azure launched, and steadfastly ever since, Microsoft has been
expanding its cloud platform with newer and more robust capabilities.
In short, the idea of applications and services running in “the cloud” means that you’re dealing
with intangible hardware resources, which, in turn, translates to a maintenance-free runtime environ-
ment. You sign up with a cloud-hosting company (Microsoft, in the case of Azure) for access, pay
them for how much power (in terms of resources) your applications require (RAM, CPU, storage,
bandwidth, scale-out load balancing, and so on), and let them worry about all the rest. Compared to
the manual labor and potential for error involved in doing things yourself, it’s both hassle free and
risk free.
But the Microsoft Azure environment is much more than a set of conventional web-hosting
facilities on steroids. In fact, your cloud-based applications and services don’t actually run directly on
these server machines. Instead, sophisticated hypervisor virtualization technology runs on top of all
this physical hardware. Your “code in the cloud,” in turn, runs on that virtualization layer. So scaling
out during peak season becomes a simple matter of changing a configuration setting that increases
Now consider the same scenario with conventional infrastructure. You’d need to purchase and
install servers, bring them online, and add them as members to a load-balanced farm. And then you’d
need to take them offline to be decommissioned later when the extra capacity is no longer required.
That requires a great deal of work and time—either for you directly or for your hosting company—
compared to tweaking some configuration with only a few mouse clicks.
Because cloud solutions can be delivered in lots of different ways, many new terms and buzzwords
have infiltrated our vocabulary in recent years. Among them are the various “as-a-service” acro-
nyms, including Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a
Service (SaaS). All these terms obviously refer to services; their differences lie in the level of service.
It’s often helpful to think of these terms as gradations of abstraction, starting with the lowest level
of the underlying hardware infrastructure. When you’re on-premises, you have no abstraction at all,
and you are intimately involved with and responsible for everything from the hardware on up. When
you move to the cloud, you can go IaaS, PaaS, or SaaS as your needs dictate, where each of those
approaches provides increasingly greater abstraction.
Infrastructure as a Service
With IaaS, Microsoft Azure effectively gives you virtual machines (VMs) that are entirely under your
control. Just as in an on-premises environment, you’ll be responsible for installing the OS and config-
uring the machine. It’s easy to build virtual machine images from scratch—or to customize existing
virtual machines from a library of preconfigured VMs—and then deploy them to run in the cloud
with full network connectivity (even Virtual Private Network [VPN] connections) and configurability.
But unlike working on-premises, you’ll never need to handle screwdrivers, hard drives, cables, racks,
power supplies, motherboards, RAM, or anything like that ever again. This is true IaaS—abstraction of
hardware (networking, storage, servers, virtualization), and nothing else.
With this capability, you could certainly create a VM on Microsoft Azure that runs Microsoft
SQL Server; that is, a virtual machine in the cloud that itself is running the full on-premises version
of SQL Server. There might be situations where that is entirely justified and valid—for example, if you
require full compatibility with on-premises SQL Server, which SQL Database does not provide. (We
discuss the differences between these two platforms in Chapter 3, “Differences between SQL Server
and Microsoft Azure SQL Database.”) A prime example of this scenario is to deliver cloud reporting
with SQL Server Reporting Services (SSRS) running in a Microsoft Azure VM (which you learn how to
do in Chapter 6, “Cloud reporting”). But understand that running SQL Server in an Azure VM is com-
pletely different than using Microsoft Azure SQL Database. Going with IaaS and SQL Server means
that you are still responsible for maintaining your virtual machine or machines in the cloud. This
Platform as a Service
With PaaS, the abstraction level gets raised above IaaS so that you are also shielded from the
operating system, middleware, and runtime layers. This means that Microsoft Azure also provides
a platform for your applications and services to run on. You have no control over the platform; you
get to manage only applications and data, while the cloud provider manages the rest of the infra-
structure. You still get to create and test your applications locally and then upload them to run on
Microsoft Azure. (We cover this in Chapter 10, “Building cloud solutions.”) This gives your application
incredible scalability without requiring the investment in expensive hardware that such scalability
would normally require in any on-premises scenario.
SQL Database, too, is a PaaS solution. It’s still SQL Server, but to deliver a relational database
platform (as opposed to infrastructure), certain features that are available on-premises are not
supported. With SQL Database, you can provision servers and databases on the fly, without ever
interacting with the OS or other underlying infrastructure. You will never need to know or care if your
data and log files are stored on a C drive or a D drive, because SQL Database handles all details of
physical storage for you. As you’ll learn about in Chapter 3, enjoying the benefits of virtually instan-
taneous provisioning and risk-free, care-free maintenance also means incurring some loss of control
that you normally get to exercise when working with SQL Server on-premises.
Software as a Service
SaaS is at the high end of the abstraction spectrum, where everything from the hardware up to and
including the end-user application is handled by the service. There are many cloud SaaS offerings
available today, including Office 365, CRM, and Salesforce.com.
You can create your own SaaS solutions with SQL Database by layering a service or website—also
hosted on Azure—over the database. (You’ll do this in Chapter 10.) You could then offer this as a
complete solution to your customers, who interact only with the application through their browser or
mobile device. Your customers are not concerned with any aspects of infrastructure or platform. They
just connect to your application. So, from their perspective, you have delivered true SaaS.
Tip If you already have a Microsoft account but you want to use a different email address
for any reason, you still don’t need to create a new account. You can either rename the
existing account or create an alias. See http://windows.microsoft.com/en-US/hotmail/
get-new-outlook-address for more information.
If you do create a new Microsoft account, the user name can be an email address you already own.
Alternatively, you can create a new email address for the account that ends either with @outlook.com
or @hotmail.com. It really makes no difference which you choose, as long as the name you provide
has not already been taken by someone else at either @outlook.com or @hotmail.com. If you do
choose to create a new email address, you will also get a new mailbox account at that address, and
Microsoft will communicate with you via that mailbox any time it needs to notify you about important
information regarding your account.
Whether you use an existing email address or create a new one, you’ll also need to assign a strong
password to protect the Microsoft account. Some additional personal information is also required,
such as your name, gender, one of two forms of identity confirmation, your country, and your
postal/Zip code.
3. For the Microsoft account user name (which is what you will be logging on to the Microsoft
Azure portal with), provide an existing email address. Or click the Or Get A New Email Address
link to create a new one available on either @outlook.com or @hotmail.com.
4. Supply a password, and then reenter it to confirm. The account requires a strong password
of at least eight characters that must contain a combination of mixed case, numbers, and
symbols.
6. Provide a phone number or alternate email address. You must provide at least one of these
identity-confirmation methods.
7. Type the random characters generated to prove that you’re a real person.
If you created a new email address in step 3, a mailbox is created for it and you are directed
immediately to the Account Summary page. If you provided an existing email address, you will receive
an email at that address from the Microsoft account team shortly after clicking Create Account. This
In the procedure that follows, you will create a free trial subscription to Microsoft Azure. At the
time of this writing, the free trial gives you $200 of credit for 30 days with access to all services. This
requires providing credit card information that will be used to bill your subscription after your trial
expires.
Important Microsoft Azure pricing and special offers are subject to ongoing change.
We strongly recommend that you visit http://www.windowsazure.com/en-us/pricing/
purchase-options/ to review the latest pricing structures available. Furthermore, special
pricing is available for MSDN subscribers. See http://www.windowsazure.com/en-us/pricing/
member-offers/msdn-benefits/ for more information.
4. If you are not already logged in to your Microsoft account, log in now.
5. You will be taken to the Free Trial Signup page, as shown in Figure 1-2.
6. Choose to either receive a text message or phone call as the method to receive a verification
code.
7. Enter the code received via the text message or phone call, and click Verify Code.
8. Provide the credit card payment details for billing after the free trial expires.
9. Select the box to indicate that you agree to all the terms.
It takes just a few moments to complete setting up your new Azure subscription, and then you’re
ready to get started working with SQL Database and all the other Microsoft Azure services.
Creating a server
It’s easy to create a server, which is akin to an instance of SQL Server in the sense that it can host
multiple databases. All you need to do is create an administrator account user name with a strong
password, and specify the geographical region where the server should be located physically. To
achieve the best performance, you should choose the region closest to your consumers. As we discuss
in Chapter 2, “Configuration and pricing,” you will also want to be sure that any Microsoft Azure cloud
SQL Database also has special firewall rules you can set to control exactly which computer or
computers can access your database server in the cloud. Minimally, you’ll need to add a rule granting
access to the IP address of your computer so that you can access the server from your local machine.
For production, you might need to add rules granting access to blocks of IP addresses. You will learn
more about firewall rules in Chapters 2 and 5.
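Although this chapter manages firewall rules through the portal, the same rules can also be scripted in T-SQL by connecting to the server’s master database as the administrative user. The following sketch is illustrative only; the rule names are arbitrary, and the local IP address is a placeholder you would replace with your own.

-- Illustrative sketch; run against the master database as the server administrator.
-- Allow a single local machine (replace the placeholder address with your own IP):
EXEC sp_set_firewall_rule
    @name = N'MyLocalMachine',
    @start_ip_address = '203.0.113.5',
    @end_ip_address = '203.0.113.5';

-- The special 0.0.0.0 address corresponds to the portal setting that allows
-- Microsoft Azure services to reach the server (discussed in the next procedure):
EXEC sp_set_firewall_rule
    @name = N'AllowAzureServices',
    @start_ip_address = '0.0.0.0',
    @end_ip_address = '0.0.0.0';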
Note The irst time you log into the portal, you are welcomed with a message
that offers to give you a brief tour. You can take the tour if you wish, or close the
message to proceed to the main portal page.
FIGURE 1-3 The Microsoft Azure Management Portal with no services yet configured
2. As illustrated in Figure 1-4, first click SQL DATABASES in the vertical navigation pane
on the left, then click SERVERS at the top of the page, and then click CREATE A SQL
DATABASE SERVER.
FIGURE 1-4 The CREATE A SQL DATABASE SERVER link on the SQL DATABASES page
4. Supply a password for the new server, and then reenter it to confirm. Typical strong password
rules apply, which require you to use a combination of mixed case, numbers, and symbols.
5. Choose a region from the drop-down list—for example, East US. For best performance, pick
the region you are located in or nearest to.
6. Be sure to leave the ALLOW WINDOWS AZURE SERVICES TO ACCESS THE SERVER check box
selected. This makes the server accessible to the Microsoft Azure cloud services that you’ll
create or use in other chapters (Microsoft Azure was formerly called Windows Azure). The
page should appear similar to Figure 1-5.
7. Click the checkmark icon on the lower-right side of the dialog to complete the settings. After
just a few moments, the new server is provisioned and ready to use, as shown in Figure 1-6.
The check box mentioned in step 6 added the special IP address 0.0.0.0, which allows cloud
services running on Microsoft Azure to access the SQL Database server. However, you still need
to add the IP address of your local machine to access the server from the SQL Database manage-
ment portal and other tools (such as SQL Server Management Studio and SQL Server Data Tools in
Microsoft Visual Studio, which you learn more about in later chapters).
To add a firewall rule for the IP address of your local machine, follow these steps:
1. Click the server name, and then click the CONFIGURE link at the top of the page.
2. To the right of your current detected IP address, click ADD TO THE ALLOWED IP ADDRESSES,
as shown in Figure 1-7. A new firewall rule for your IP address is added.
FIGURE 1-7 Adding your local IP address to the list of IP addresses allowed through the firewall
4. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page for
the new server.
You might need to wait a few moments for the new firewall rule to take effect, although typically
it happens very quickly (often within five to ten seconds). If you don’t wait long enough, however, and
the rule has not yet taken effect, you can be quite certain that you will not be able to connect to the
server from your local machine until it does.
In Chapter 2, you’ll learn more about the different options for database edition and maximum size.
For right now, the important thing to know is that all these settings (except for the default collation)
can be easily changed later on. As part of the elastic scaling provided by SQL Database, you can
freely switch back and forth between the Web and Business editions. You can also switch up and
down between the sizes (1 GB or 5 GB for the Web edition, or 10 GB through 150 GB for the Business
edition) as your changing needs dictate. And if 150 GB is still too small for you, you can partition your
database using special sharding techniques, as we explain in Chapter 8, “Designing and tuning for
scalability and high performance.”
1. If you are continuing from the previous procedure, click the DATABASES link at the top of the
page and then skip to step 4. Otherwise, if you have logged out since then and are starting
fresh, continue with step 2.
4. Click CREATE A SQL DATABASE, as shown in Figure 1-8. This opens the NEW SQL DATABASE
dialog.
FIGURE 1-8 The CREATE A SQL DATABASE link on the portal’s SQL DATABASES page
6. Leave the default settings to create a Web edition database up to 1 GB in size using the
SQL_Latin1_General_CP1_CI_AS collation.
Note Chapter 2 discusses the Web and Business editions, the maximum database
sizes, and the significance of SQL Database collations.
7. Choose the server you created in the previous procedure from the drop-down list. The page
should appear similar to Figure 1-9.
8. Click the checkmark icon in the lower right of the dialog to complete the settings.
After a few more moments, the new WineCloudDb database is created and ready to use, as shown
in Figure 1-10.
1. If you are continuing from the previous procedure, skip to step 4. Otherwise, if you have
logged out, continue with step 2.
6. Scroll the page down a bit, and find the MANAGE URL link in the quick glance section at the
right of the page, as shown in Figure 1-11.
7. Click the MANAGE URL link. This opens a new browser tab to the SQL Database portal’s login
page.
Note The SQL Database portal is Silverlight-based. If you don’t have Silverlight
installed, you will first be prompted to download it before you can use the portal.
8. Type the user name (for example, saz) and password you specified when you created the
server, as shown in Figure 1-12.
FIGURE 1-13 The Summary view shows properties for the WineCloudDb database
FIGURE 1-15 Defining the Wine table using the portal’s table designer
1. Log in to the SQL Database management portal for the WineCloudDb database, as described
in the previous procedure.
3. Click New Table. The table designer opens with a default name of Table1, an integer ID
column, and two string columns named Column1 and Column2.
5. Change the ID column name to WineId, leaving it as the required primary key.
6. Select the Is Identity? check box for the WineId column. When you insert new wine products
into the table, this setting tells SQL Database to automatically assign incremental integer
values for this column in each new row.
7. Change the Column1 column name to Name, leaving it as a required nvarchar(50) string.
8. Change the Column2 column name to Category, leaving it as a required nvarchar(15) string.
9. Click Add Column. This adds another integer column named Column1 to the table design.
You have now created the Wine table, which should appear similar to the image shown in
Figure 1-15 earlier.
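For reference, the table produced by these designer steps corresponds roughly to the following T-SQL sketch. The Year column name and its required setting are inferred from the data-entry and query procedures later in the chapter, so treat them as assumptions.

CREATE TABLE Wine
(
    WineId int IDENTITY PRIMARY KEY,   -- Is Identity? selected; required primary key
    Name nvarchar(50) NOT NULL,
    Category nvarchar(15) NOT NULL,
    Year int NOT NULL                  -- the integer column added in step 9
);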
1. Click the [WineCloudDb] tab on the top left side of the page.
2. Click the Design tab on the bottom left. This takes you to the same page you used before
when you created the Wine table. (See Figure 1-14.)
5. Change the ID column name to CustomerId, leaving it as the required primary key.
7. Change the Column1 column name to FirstName, leaving it as a required nvarchar(50) string.
8. Change the Column2 column name to LastName, leaving it as a required nvarchar(15) string.
10. Change the new column name to FavoriteWineId, leaving it as an optional integer (meaning
you do not select the Is Required? check box).
Now the database has Wine and Customer tables. These tables store (obviously) your wine
products and your customers, though (equally obviously) they are both empty at this point.
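As with the Wine table, the Customer table designed above corresponds roughly to this T-SQL sketch; FavoriteWineId is nullable because the Is Required? check box is left unselected.

CREATE TABLE Customer
(
    CustomerId int IDENTITY PRIMARY KEY,
    FirstName nvarchar(50) NOT NULL,
    LastName nvarchar(15) NOT NULL,
    FavoriteWineId int NULL            -- optional favorite wine
);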
The SQL Database management portal offers a foreign-key management experience that makes
defining the relationship easy, as shown in Figure 1-16.
To define the relationship between the two tables, follow these steps:
1. While still in the Customer table design page, click Indexes And Keys at the top of the page.
2. On the right side of the page, click Add A Foreign Key Relationship. The foreign-key designer
appears as shown in Figure 1-16.
3. Select the FavoriteWineId column in the Customer table. This specifies the foreign-key column.
7. Click Save.
The relationship is created, and the designer should now appear similar to Figure 1-17.
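In T-SQL terms, the relationship created by the foreign-key designer is roughly equivalent to the following statement; the constraint name here is illustrative, since the portal generates its own name.

ALTER TABLE Customer
    ADD CONSTRAINT FK_Customer_Wine    -- illustrative constraint name
    FOREIGN KEY (FavoriteWineId) REFERENCES Wine (WineId);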
To populate the Wine table with sample products, follow these steps:
1. Log in to the SQL Database management portal for the WineCloudDb database, and navigate
to the Wine table design page, as you did in previous procedures.
4. Enter a row with Name, Category, and Year values of Chateau Penin, Bordeaux, and 2008,
respectively.
Note Because you selected the Is Identity? check box for the WineId column
when you designed the table, the designer displays <auto> to indicate that SQL
Database will automatically assign a value for WineId when you save these rows
to the database.
None of the data you entered is actually saved to the database until you click Save in step 6. At
that point, the rows are inserted and the display is refreshed to show the WineId primary-key values
that were automatically assigned by SQL Database. Being the very first four rows added to the table,
those primary keys were assigned the numbers 1 through 4, as shown in Figure 1-19.
FIGURE 1-19 The Wine table populated with rows of data with automatically assigned primary-key identity values
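The equivalent T-SQL is a set of INSERT statements like the sketch below. Only the Chateau Penin row is spelled out in the steps above; the other three rows, and the order that yields identity values 1 through 4, are inferred from the favorite-wine references later in this chapter, so treat them as assumptions.

INSERT INTO Wine (Name, Category, Year) VALUES ('Chateau Penin', 'Bordeaux', 2008);
INSERT INTO Wine (Name, Category, Year) VALUES ('McLaren Valley', 'Cabernet', 2005);  -- inferred
INSERT INTO Wine (Name, Category, Year) VALUES ('Mendoza', 'Merlot', 2010);           -- inferred
INSERT INTO Wine (Name, Category, Year) VALUES ('Velle Central', 'Merlot', 2009);     -- inferred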
Now add some data to the Customer table. You’ll use the same procedure you just followed for the
Wine table. The only additional consideration to keep in mind is that each customer has a foreign-key
value that identifies that customer’s favorite wine. Because you informed SQL Database about this
foreign-key relationship in a previous procedure, you can only supply a value of 1 through 4 for each
customer’s FavoriteWineId column (or other integers for rows that are added to the Wine table in the
future). Your only other option is to supply NULL, because FavoriteWineId is optional (meaning that
it’s OK if the customer’s favorite wine is unknown). Otherwise, as we’ll demonstrate, SQL Database will
not permit you to add a customer row with a non-NULL value for FavoriteWineId that does not have a
related row in the Wine table.
4. Enter a row with FirstName, LastName, and FavoriteWineId values of Jeff, Hay, and 4,
respectively.
5. Click Add Row again to enter another row for Mark, Hanson, 3.
6. Click Save to save the rows to the database. The two rows are automatically assigned primary
key values of 1 and 2 for the CustomerId column.
7. Click Add Row again to enter a third row for Jeff, Phillips, but this time type a 6 for the
FavoriteWineId.
9. Expand the Error Details as shown in Figure 1-20. This displays the error message describing
the foreign-key conlict that occurred because there is no row in the Wine table with a WineId
of 6.
FIGURE 1-20 Error message displayed when attempting to violate a deined foreign-key relationship
11. Click Save. Now Jeff Phillips gets saved to the Customer table. (See Figure 1-21.)
FIGURE 1-21 The Customer table populated with several rows of data
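Expressed in T-SQL, the inserts in this procedure look roughly like the following sketch, which also shows the statement that the foreign-key constraint rejects.

INSERT INTO Customer (FirstName, LastName, FavoriteWineId) VALUES ('Jeff', 'Hay', 4);
INSERT INTO Customer (FirstName, LastName, FavoriteWineId) VALUES ('Mark', 'Hanson', 3);

-- This statement fails with a foreign-key conflict, because no Wine row has WineId 6:
INSERT INTO Customer (FirstName, LastName, FavoriteWineId) VALUES ('Jeff', 'Phillips', 6);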
The SQL Database management portal has an ad-hoc query window that lets you run T-SQL
queries and view their results. To query for customers and their favorite wines, follow these steps:
1. Log in to the SQL Database management portal for the WineCloudDb database.
2. Click New Query at the top of the page to open a new query window.
SELECT
c.FirstName,
c.LastName,
w.Category,
w.Name
FROM
Customer AS c
LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
ORDER BY
c.LastName, c.FirstName;
4. Click Run at the top of the page. SQL Database executes the query and displays the results in
the bottom portion of the query window, as shown in Figure 1-22.
5. Modify the query by adding a WHERE clause. Just before the ORDER BY clause, type WHERE
w.Category = 'Merlot'
6. Click Run again. The query executes once more, this time returning only customers whose
favorite wine is a Merlot, as shown in Figure 1-23.
8. Click Run once more. This time, the query filters on the wine year, as shown in Figure 1-24.
In step 3, notice how you joined the Customer and Wine tables in the query’s FROM clause. The
LEFT OUTER JOIN ensures that customer rows are returned even if they contain a NULL in Favorite-
WineId—that is, even if they have no favorite wine. Using an INNER JOIN instead would automatically
exclude customers without a favorite wine.
The initial version of this query had no WHERE clause, and with a LEFT OUTER JOIN, there was no
filtering at all. So, at first, it returned every customer and displayed their favorite wine. (NULL would
be returned for the wine name and category for customers without a favorite wine.)
By adding the WHERE clause in step 5, you asked SQL Database to filter the results to include only
customers whose favorite wine is any kind of Merlot. The Wine table was aliased as w in the FROM
clause, so w.Category in the WHERE clause refers to, and filters by, the Category column in the row
joined in from the Wine table. Running this version of the query returns just the two customers with
Merlot as their favorite wine category.
In step 7, you changed the WHERE clause to filter by w.Year, which is the year of the wine. Notice
how this column is not actually in the result set returned by the SELECT statement, yet it is perfectly
valid to filter on it. This version of the query now returns the two customers with favorite wines (in any
category) older than 2010. These are Jeff Hay (with a Velle Central Merlot from 2009) and Jeff Phillips
(with a McLaren Valley Cabernet from 2005). Mark Hanson’s favorite wine is the Mendoza Merlot, but
he is filtered out from these results because that wine is from 2010, and the query is returning only
customer rows with favorite wines that are older than 2010.
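The exact WHERE clause typed in step 7 is not reproduced above, but based on this description it is presumably a filter on the wine year, along these lines:

-- Step 5 version:
WHERE w.Category = 'Merlot'

-- Step 7 version (presumed), returning favorite wines older than 2010:
WHERE w.Year < 2010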
The SQL Database management portal has matured greatly since the early days of Microsoft Azure
(when it was called the SQL Azure management portal). And you can expect it to continue evolving—
quite possibly even by the time this book goes to press. Before concluding the chapter, we recom-
mend you take the time to examine some of these additional capabilities available in the current SQL
Database management portal at the time of this writing.
Creating views
Views are essentially encapsulated queries that are stored in the database. In most respects, your
queries can treat views just as ordinary tables.
For example, you could create three views that encapsulate the three versions of the query from
the previous section “Querying the database.” You might name those three views as follows:
■ CustomersWithFavoriteView
■ CustomersWithFavoriteMerlotView
■ CustomersWithFavoritePre2010View
With those views in place, it becomes much easier to query the database. For example, you can
just select from the CustomersWithFavoriteMerlotView, instead of writing the lengthier version of the
query that joins the Customer and Wine tables.
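As a sketch, the Merlot view could be defined by wrapping the earlier query in a CREATE VIEW statement. The ORDER BY clause is omitted because a view definition cannot contain one; you order the results when you query the view.

CREATE VIEW CustomersWithFavoriteMerlotView AS
    SELECT c.FirstName, c.LastName, w.Category, w.Name
    FROM Customer AS c
    LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
    WHERE w.Category = 'Merlot';

-- Querying the view:
SELECT * FROM CustomersWithFavoriteMerlotView ORDER BY LastName, FirstName;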
Stored procedures are commonplace in professional relational databases. They are often used to
protect underlying tables from inappropriate usage. They can also build on views, implement business
logic, or further abstract details of the underlying database structure—hiding the way that tables,
views, and columns are named; how the table relationships are defined; and so on. Essentially, and
particularly from the perspective of designing multitiered layered architectures, stored procedures
can be effectively leveraged to implement a service layer over your data, at the database level.
Here’s a stored procedure you can create that joins the Customer and Wine tables to return
customers with their favorite wine, just as you did in the previous query. But this stored procedure
will have some added flexibility; it will accept a @FavoriteWineId parameter so that the results can be
limited to returning just those customers whose favorite wine matches the value passed in through
4. Click New Stored Procedure. The stored procedure designer opens with a default name of
Stored Procedure1.
6. Click Add Parameter to create a new parameter named @Parameter1 with a data type of
nvarchar(50).
8. Click the drop-down list beneath Select Type to change the parameter data type to int.
SELECT
c.FirstName, c.LastName, w.Category, w.Name
FROM
Customer AS c
LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
WHERE
(@FavoriteWineId IS NULL) OR (c.FavoriteWineId = @FavoriteWineId)
ORDER BY
c.LastName, c.FirstName;
1. Click New Query at the top of the page to open a new query window.
3. Click Run at the top of the page to execute the stored procedure, passing in NULL for the
@FavoriteWineId parameter. SQL Database returns all three customers and the names of their
favorite wine.
4. Modify the EXEC statement from step 2 by changing NULL to 3, which is the WineId for the
2010 Mendoza Merlot.
5. Click Run again. This time, only Mark Hanson is returned, because (currently) he’s the only
customer that has selected 2010 Mendoza Merlot as his favorite wine.
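Pulled together as a single T-SQL script, the stored procedure and the EXEC statements from this section look roughly like the sketch below. The procedure name is illustrative; use whatever name you gave it in the designer.

CREATE PROCEDURE GetCustomersWithFavoriteWine   -- illustrative name
    @FavoriteWineId int = NULL
AS
    SELECT
        c.FirstName, c.LastName, w.Category, w.Name
    FROM
        Customer AS c
        LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
    WHERE
        (@FavoriteWineId IS NULL) OR (c.FavoriteWineId = @FavoriteWineId)
    ORDER BY
        c.LastName, c.FirstName;
GO

-- Returns all three customers and their favorite wines:
EXEC GetCustomersWithFavoriteWine NULL;

-- Returns only Mark Hanson, whose favorite is the 2010 Mendoza Merlot (WineId 3):
EXEC GetCustomersWithFavoriteWine 3;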
The portal supports the Data-tier Application Component Package file format, commonly referred
to as DACPAC files. A DACPAC file contains the complete definition of a database, and it can be lever-
aged for streamlined incremental deployments of the database design. A BACPAC file is similar, except
that in addition to the definition of the database, it includes actual data as well. In the management
portal, there is full support for importing and exporting DACPAC and BACPAC files. Locally, SQL
Server Data Tools (SSDT) in Visual Studio can also be used to define DACPAC files and deploy them to
Microsoft Azure SQL Database. You will learn more about DACPAC and BACPAC in later chapters.
You can also track events from the SQL Database management portal. This lets you keep an eye on
things like database connections (whether they succeed or fail), deadlocks, and throttling events.
Summary
This chapter got you acquainted with Microsoft Azure SQL Database. We began with an overview of
Azure and cloud computing, and then demonstrated how easy it is to get signed up for a Microsoft
account and an Azure subscription. You then used the Microsoft Azure management portal to quickly
create a new server, and then create a new database on that server.
You also learned how to use the SQL Database portal (which you launched from the Microsoft
Azure portal using a special management URL) to design the database. You created and populated
two related tables, and then you opened a query window to run a few SELECT queries that joined the
tables. You also learned about creating views, creating stored procedures, and the availability of other
database administration features in the SQL Database management portal.
Now that we’ve introduced you to the SQL Database platform, you’re ready to move on to
Chapter 2, where we will delve more deeply into the details of setup and configuration.
Configuration and pricing
Now that you have your first database up and running, you're ready to explore additional options
for managing the setup and configuration of SQL Database. You'll learn more capabilities of the
Microsoft Azure management portal you started working with in Chapter 1, “Getting started with
Microsoft Azure SQL Database” as well as Microsoft SQL Server Management Studio (SSMS) and
Windows PowerShell, all of which can be used to administer SQL Database.
In this chapter, we show you how to use the aforementioned tools to create and drop (delete)
databases. Many other chapters in this book also use SSMS to perform more detailed actions and
specific tasks. After acquainting you with these tools, the chapter concludes with a discussion of
pricing and provides helpful tips for reducing the cost of using SQL Database.
Quick Create is the fastest and easiest way to create a database. If you are fine with a 1-gigabyte
(GB) Web edition database and the default collation for North American and Western European lan-
guages, use the Quick Create option. (The different editions are explained later in this chapter in the
section "Configuring the database edition and size.") If you need to customize the language settings
to support other language types stored in the database, or if you want to preset the database for a
larger size than 1 GB, use the Custom Create option. Finally, if you already have an existing database
(either a Microsoft Azure SQL Database on another server or an on-premises SQL Server database)
that you want to bring into a particular Microsoft Azure SQL Database server, use the Import option.
Warning You can change the database edition and size at a later time, but not the
collation. So if most users of this database are outside of the U.S. and western Europe,
make sure to set the appropriate collation to support the required languages when you
create the database.
Quick Create
With Quick Create, you need to supply only two pieces of information to create a database: the name
of the new database, and the name of the existing server that the new database should be hosted on
(although, indeed, Quick Create will allow you to create a new server for the database on the fly, at
the same time the database is created).
To create a new SQL Database using Quick Create, follow these steps:
6. For SERVER, choose any available server from the drop-down list to host the database (or
choose New SQL Database Server from the drop-down list to create a new server on the fly).
After a brief moment, the MyQuickCreateDb database is created and ready for use. This is a 1-GB
Web Edition SQL Database with the default collation.
Custom Create
With Custom Create, you have control over the options for the new database that you don’t get
with Quick Create. These options include the database edition, size, and collation. Indeed, you used
Custom Create in Chapter 1 to create the WineCloudDb database; you just invoked it differently, by
clicking the CREATE A SQL DATABASE link. (Refer back to Figure 1-9.)
Database editions and size are discussed later in this chapter in the section "Configuring the
database edition and size,” but the important thing to know up front is that you can always change
these settings later on, after the database is created. The collation, however, cannot be changed once
the database is created. Collation is important when you need to use languages other than western
European languages. For example, if your main audience needs its data stored in Mandarin, Cyrillic, or
Arabic, you should set the collation appropriately.
Follow these steps to create a new SQL Database using Custom Create:
4. Click CUSTOM CREATE. This opens the NEW SQL DATABASE dialog, as shown in Figure 2-2.
6. Choose the edition (Web for 1 to 5 GB, or Business for 10 to 150 GB).
8. For SERVER, choose any available server from the drop-down list to host the database (or
choose New SQL Database Server from the drop-down list to create a new server on the fly).
After a brief moment, the MyCustomCreateDb database is created and configured with the edition,
size, and collation you specified.
Importing a database
A third way of creating a database in the portal is to import an existing database from a BACPAC file.
A BACPAC file is, essentially, a backup of an entire database (schema and data), stored as a binary
large object (BLOB).
You can create a BACPAC file from a local SQL Server (on-premises) database or from a SQL
Database on Azure. Once you have a BACPAC file, you can import it to SQL Database. This makes it
a straightforward way to migrate an existing database into the cloud.
Importing (and exporting) databases on Azure is facilitated by storing BACPAC files in Azure Blob
Storage. This, in turn, requires the creation of a Microsoft Azure Storage account, which is simply an
account that provides access to cloud storage for your BACPAC files. The entire process is explained in
Chapter 4, in the section "SQL data-tier applications."
It is important to understand that the IP address that needs to be specified is not your IP address
on your local network, but the IP address that the Microsoft Azure datacenter sees when you attempt
to access something. So if you are accessing the database from your office, it is the public static
IP address of the office router that needs to be specified. For example, if you have IP addresses as
shown in Figure 2-3, the address you need for the firewall configuration is the public static IP address
123.456.789.012.
Office
Azure Datacenter
FIGURE 2-3 Public IP address of Internet Gateway Router requires access to SQL Database in the Microsoft Azure
datacenter.
The method by which you find the appropriate IP address differs depending on whether you are
behind that same public router or not. The next two sections describe how to enable the firewall rules
in these two scenarios.
2. Click SQL DATABASES in the vertical navigation pane on the left. This shows a list of available
databases, which should include the WineCloudDb database you created in Chapter 1.
3. Click on the database name WineCloudDb. This displays a page of quick-start links for the
database.
4. Click the Set Up Windows Azure Firewall Rules For This IP Address link, as shown in Figure 2-4
(Microsoft Azure was formerly called Windows Azure). Again, if you already added the same
IP address as in Chapter 1, you will receive an error message stating that the IP address has
already been added to the firewall rules (which is expected, of course).
FIGURE 2-4 The quick-start link to create a firewall rule for the local network.
5. When prompted to update the firewall rules, as shown in Figure 2-5, click YES.
To open the firewall for a remote user who needs access to the database, you normally need to
contact a network administrator to learn the IP address or addresses. If you can't or don't want to
reach out to the administrator, you can open the firewall for the remote user by working
cooperatively, with the help of whatismyipaddress.com.
2. Have the user open a browser and navigate to the website http://whatismyipaddress.com.
3. Have the user read the IP address that the website reports to them and write it down.
6. Click the server name of the database to which the remote user needs access.
7. Click the CONFIGURE link at the top of the page. You will see a page that shows all the allowed
IP addresses (the whitelist), which should already include the one for your local machine that
was previously added automatically.
8. In the RULE NAME text box, enter a descriptive name (no spaces allowed) for the remote user
or group to which you are granting access to the database—for example, RemoteDevOffice.
9. Enter the IP address you wrote down in step 3 (the one the remote user reported from
whatismyipaddress.com) into both the START IP ADDRESS and END IP ADDRESS text boxes. (In
this scenario, you are creating a rule for a single address, but these text boxes can also be used
to specify a range of IP addresses as desired.) The page should appear similar to Figure 2-6.
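Firewall rules can also be scripted rather than entered through the portal. SQL Database provides
a system stored procedure for this purpose (it comes up again in Chapter 3). The following is a rough
sketch, run from a query window connected to the master database; the rule name matches this
example, and the IP address shown is a placeholder for the one the remote user reported:

EXEC sp_set_firewall_rule
    @name = N'RemoteDevOffice',
    @start_ip_address = '203.0.113.42',
    @end_ip_address = '203.0.113.42';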
To obtain the connection string of the WineCloudDb database for various database clients, follow
these steps:
2. Click SQL DATABASES in the vertical navigation pane on the left. This shows a list of available
databases, which should include the WineCloudDb database you created in Chapter 1.
3. Click on the database name WineCloudDb. This displays a page of quick-start links for the
database.
4. Click the View SQL Database Connection Strings For ADO.NET, ODBC, PHP, And JDBC link,
as shown in Figure 2-7. Also, take note of the server name beneath the link, which is always
suffixed with .database.windows.net. You will need this server name to connect to SQL
Database using SSMS a bit further on in the chapter.
5. The page displays the connection strings, as shown in Figure 2-8. Each connection string
appears in a text box (that scrolls, if necessary), from which you can easily copy and paste into
your client application.
FIGURE 2-8 Viewing the database connection strings for ADO.NET, ODBC, PHP, and JDBC.
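For reference, the ADO.NET connection string shown by the portal takes roughly the following form
(the server name, login, and password here are placeholders; copy the actual values from the portal):

Server=tcp:yourserver.database.windows.net,1433;Database=WineCloudDb;User ID=yourlogin@yourserver;Password={your_password};Encrypt=True;Connection Timeout=30;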
2. Click SQL DATABASES in the vertical navigation pane on the left. This shows a list of available
databases, which should include the MyQuickCreateDb and MyCustomCreateDb databases
you created earlier in this chapter.
3. Click to select the row for MyQuickCreateDb. (Don’t click in the NAME or SERVER columns.)
5. Click YES, DELETE when prompted that the database will be permanently deleted.
The management portal is great for working with SQL Database from any computer with a web
browser, without requiring any other special software or tools. Alternatively, there are a number of
local tools available that can also connect to and work with SQL Database. This includes SSMS and
PowerShell, which are covered next in this chapter, as well as SQL Server Data Tools (SSDT), which is
covered in Chapter 10.
You must already have SSMS installed to follow the procedures in this section. If you don’t already
have SSMS, you can download it for free (either by itself, or along with SQL Server Express edition).
Instructions for downloading SSMS can be found in the Introduction.
Once you connect with SSMS, you can use Object Explorer to navigate between objects in SQL
Database just as you can with an on-premises SQL Server database. However, most of the other
graphical designers and dialogs are not available. For example, if you try to design a table or create a
new database, SSMS will open a new query window with template Transact-SQL (T-SQL) script for you
to edit, rather than opening the table designer or the New Database dialog, as you might expect. This
is because these features rely on SQL Server Management Objects (SMO), which SQL Database has only limited support for.
Note SQL Server Data Tools (SSDT), which runs inside Visual Studio, can also connect to
SQL Database, and it works very similarly to SSMS. Unlike SSMS, however, SSDT does not rely
on SMO, so the SSDT table designer and other SSDT graphical dialogs are supported for
SQL Database just the same as they are for on-premises SQL Server. You will learn much
more about SSDT in Chapter 10.
1. From the Windows Start screen, launch SSMS. You can either scroll through the app tiles to
find it (in the Microsoft SQL Server 2012 category) or just type sql server management
studio to run a search, and then click on the tile, as shown in Figure 2-9. After a brief moment,
the Connect To Server dialog appears.
FIGURE 2-9 Launching SQL Server Management Studio from the Windows Start screen.
Tip If you have trouble figuring out the server name, you can easily find it at
the bottom of the quick-start links page, as shown in Figure 2-7. You can also
discover the server name by viewing the Connection Strings dialog, as shown in
Figure 2-8.
3. For Authentication, select SQL Server Authentication from the drop-down list. (SQL Database
does not support Windows Authentication.)
FIGURE 2-10 Connecting to SQL Database from the Connect To Server dialog in SSMS.
After a brief moment, the connection is made, and you can then use SSMS to manage SQL
Database. If SSMS fails to connect, the most likely cause is that your public IP address has not been
added to the firewall rules, as described earlier in this chapter. The error message will make it clear if
this is the problem. If the connection fails with a more generic error message, ensure that port 1433 is
open on your local firewall. (SQL Database, like SQL Server, uses port 1433 to communicate.)
Once connected, you can drill through Object Explorer to the WineCloudDb database, as shown in
Figure 2-11.
FIGURE 2-11 Drilling down to the tables in a database using Object Explorer in SSMS.
To create a new database using an SSMS query window, follow these steps:
3. In the new query window, type CREATE DATABASE MyDb. The SSMS window should appear
similar to Figure 2-12.
FIGURE 2-12 Creating a new database by executing T-SQL script in an SSMS query window.
4. Press F5 (or click Execute on the toolbar) to execute the statement. It should take only a few
moments for execution to complete.
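The CREATE DATABASE statement in SQL Database also accepts options for the edition and maximum
size, so you are not limited to the default 1-GB Web edition database. For example, a statement along
these lines would create a 10-GB Business edition database instead (the values shown are just one
valid combination):

CREATE DATABASE MyDb (EDITION = 'business', MAXSIZE = 10 GB);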
Deleting a database
Dropping a SQL Database is like dropping any SQL Server database. Either right-click on the database
in Object Explorer and choose Delete, or execute the following T-SQL command in the query window:
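DROP DATABASE MyDb;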
Using PowerShell
PowerShell is Microsoft’s modern scripting language for system administration that supports a wide
variety of tasks by executing commands (known as cmdlets, pronounced command-lets) from the
PowerShell command line. Microsoft has also developed PowerShell cmdlets for managing Microsoft
Azure, including a number of useful SQL Database commands.
Even if you already have PowerShell installed, these special cmdlets for Microsoft Azure need to be
installed separately. The following section describes how to download and install the cmdlets.
2. Scroll down to Command-Line Tools, and click the Install link beneath Windows PowerShell, as
shown in Figure 2-13.
5. In the Web Platform Installer dialog (shown in Figure 2-14), click Install.
There is no app tile for the PowerShell ISE on the Windows Start screen, so it needs to be launched
from the command line. To start the PowerShell ISE and view help information for the Azure SQL
Database cmdlets, follow these steps:
1. Open a command prompt. (An app tile for it can be found on the Start screen in the Windows
System category, or you can just type command prompt to search for it.)
3. At the PowerShell ISE prompt, type get-help get-azuresql, and then pause. In a moment,
a popup window appears showing all the cmdlets that start with get-azuresql, as shown in
Figure 2-15.
FIGURE 2-15 Obtaining help on the Microsoft Azure SQL Database PowerShell cmdlets.
4. Double-click on any of the cmdlets to complete the command, and then press Enter to view
help for the selected cmdlet.
4. Click Save to save the .publishsettings file to your default Downloads folder.
Important The .publishsettings file should be kept safe and private, because it
effectively provides access to the Azure subscriptions on your Microsoft account.
Note The file name might be long, but the PowerShell ISE auto-complete feature
helps with an IntelliSense-style drop-down list as you type. Just press the Tab
key to auto-complete your way through the command, and through the folder
names and file name of the .publishsettings file.
The PowerShell ISE doesn’t boast with a message when the settings are imported successfully.
You’ll only get an error message if it fails. Otherwise, you’ll know that all went well if you are silently
returned back to the PowerShell command-line prompt.
As you learned in Chapter 1, every SQL Database is hosted on a server. Recall how you used the
Microsoft Azure management portal to first create a server, and then to create a database on that
server. You also used the portal to set firewall rules to allow access to your SQL Database from des-
ignated IP addresses. You will now perform those very same tasks using just a few simple PowerShell
commands.
Note If you don’t know your IP address, you can ind out what it is by using
whatismyipaddress.com, as explained in the section “Enabling access to a remote
IP address” earlier in the chapter.
To create a new database now, follow these steps in the PowerShell ISE:
4. It’s often useful to view all the databases that exist on the server. To do so, type
Get-AzureSqlDatabase –Context $context. As shown by this cmdlet’s output in Figure 2-19,
the server includes a master database, just as an on-premises SQL Server does.
FIGURE 2-19 Listing all the databases that exist on the server.
The database you just created with New-AzureSqlDatabase is, by default, a Web edition database
with a maximum size of 1 GB and the default collation. This is the same type of database that gets
created when you use Quick Create in the Microsoft Azure management portal. To override these
defaults, specify the –Edition, –MaxSizeGb, and –Collation switches with an edition, maximum size,
and collation of your own choosing. For example, the following statement creates a Business edition
database with a maximum size of 150 GB (the largest possible):
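New-AzureSqlDatabase -Context $context -DatabaseName "MyBigDb" -Edition Business -MaxSizeGb 150

(The database name MyBigDb is only an example; $context is the connection context object created
earlier in this section.)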
You can also change the edition and maximum size (but not the collation) of an existing database
by using the Set-AzureSqlDatabase cmdlet with the –Edition and –MaxSizeGb switches. For example,
you can use the following command to reconfigure the MyNewDb database you just created as a
Business edition database with a maximum size of 20 GB:
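Set-AzureSqlDatabase -Context $context -DatabaseName "MyNewDb" -Edition Business -MaxSizeGb 20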
If you are using Remove-AzureSqlDatabase to write scripts you intend to run with no user
intervention, you can include the –Force switch. This switch causes the database to be deleted
immediately, without being prompted to confirm.
The information in this section will help you figure out the right configuration and give you tips as
to how to save money on your SQL Database deployments. You should also take a look at the online
pricing calculator, which quickly calculates pricing based on your input. The online pricing calculator is
available at http://www.windowsazure.com/en-us/pricing/calculator.
SQL storage
The biggest cost of using SQL Database is for the actual disk space required for storage in Microsoft
Azure. Table 2-1 shows current pricing for SQL Database storage.
Client bandwidth
If you connect to your SQL database from within the same datacenter, you do not incur any
bandwidth charges for the data flowing either in to or out of the database. If you connect to the
database from outside the datacenter, your database incurs only “egress” charges for bandwidth
usage, which means that data flowing out of the database to clients is charged, while “ingress” flows
(data coming into the database) are free.
The charges are also different in different areas of the world. Client bandwidth pricing is based on
the location of the Microsoft Azure datacenter, regardless of where the client accessing the database
is located. Again, noting that pricing details are subject to change, the data-transfer pricing details at
the time of this writing are shown in Table 2-2.
Alternatively, you can enable geo-replication on your storage account. This is an attractive option
for backups because it builds in geographic distribution for disaster recovery, and it does so more
cheaply than paying for storage in a separate datacenter as well as paying bandwidth costs to get to
that datacenter.
If you choose to run those backups within the same datacenter, the bandwidth is free.
Another option to consider is to use geo-replicated storage, as mentioned in the previous section,
which also provides the protection of storing backups in different datacenters. In this scenario, the
storage account you set up for your database backups has geo-replication turned on, and the data is
automatically replicated out to another datacenter within the same region. For example, if your stor-
age account is in the North Central datacenter, the replica of the data might be in the South Central
datacenter, where both datacenters are in the U.S.; it would not be in some datacenter in Europe
or Asia.
The main reason why geo-replicated storage is attractive is cost. Imagine you have a 100-MB
database and take backups every hour to a remote storage account. If you have hourly, daily, weekly,
monthly, and yearly backups, you pay the following:
0.1 GB × $0.070/GB/month × 5 backup files = $0.035/month for the storage, plus another
$8.64 per month in bandwidth charges, for a total of about $8.69 for backups
If you pay the higher amount for geo-replicated storage, it actually saves you money. The cost for
the geo-replicated storage for the same database would be as follows:
0.1 GB × $0.095/GB/month × 5 backup files = $0.0475/month for the storage
However, because it is already geo-replicated, you do not have to pay for the bandwidth to do the
replication. Your total amount for backup storage in that case is only $0.0475 per month, instead of
$8.69 in the manually replicated scenario.
Support
When discussing costs, one thing that is frequently overlooked is the cost of a support package.
Microsoft offers several support tiers, which can be viewed at http://www.windowsazure.com/en-us/
support/plans, as shown in Figure 2-20.
When you’re just getting started and need some help, it’s easy to get by on just the forums and
other online resources. But once your applications start becoming more complex and you start sup-
porting lots of users, it is a good idea to have a support plan in place. We have found that in the early
stages of trying to diagnose issues with deployments, it is useful to have Microsoft personnel help
with troubleshooting. We recommend that, after your trial period ends, you start off with at least the
Developer support plan.
■ Don’t store BLOBs in the database. Use Azure Blob Storage for image, video, and text iles that
you might otherwise store as varbinary(max) or image column in the database. The cost of
Blob Storage is much less than SQL Database. A 100-GB SQL Database costs $175 per month,
but Blob Storage costs only $7 per month. To reduce costs and improve performance, put
these large items in your Blob Storage, and just store the Blob Storage record key in your
database to reference it. This strategy will have a huge effect on price if you store files in
your database.
■ Place your SQL Database in the same datacenter as your websites, mobile services, and other
Azure components that will be clients of the database. Co-locating the applications with
the database not only prevents you from incurring data bandwidth charges for data going
between two datacenters, but also makes your application run faster.
■ Use a strategy for removing old backups such that you maintain history but reduce storage
needs. If you maintain backups for the last hour, day, week, month, and year, you have good
backup coverage while not incurring more than 25 percent of your database costs for backup.
If you have a 1-GB database, your costs would be $9.99 per month for the database and only
$0.10 per month for the backup space.
■ Instead of using a remote storage account for your backups, use geo-replicated storage to
keep from incurring bandwidth charges.
■ If you intend to use a substantial amount of Azure resources for your application, you can
choose to use a volume purchase plan. These plans allow you to save 20 to 30 percent of your
datacenter costs for larger applications.
In the past, Microsoft has talked about the possibility of including more features in the Business
edition than the Web edition. However, at this time, the features are the same for both editions. So
then, what (if any) is the purpose of the edition and size settings? It essentially comes down to con-
trolling cost. The database edition dictates your maximum size, and your maximum size is there for
cost containment, as shown in Table 2-4.
Web 1 GB, 5 GB
Business 10 GB to 150 GB
Note At the time this book went to press, Microsoft announced the Preview availability of
SQL Database Premium. This is a more costly option than the standard Web and Business
editions of SQL Database, and it supports a maximum database size of 500 GB. As we ex-
plain in Chapter 8, SQL Database Premium also lets you scale up for performance using
dedicated CPU and memory.
When your database reaches the maximum size, it will no longer allow you to insert data, although
you may still update and delete data. You should plan ahead for this scenario. One option is to create
space by deleting unnecessary records. Note that it can take a bit of time after you delete records
for the space to free up, so don’t expect to be able to recover instantaneously after the cleanup. Your
other option is to increase the limit on your database, whether permanently (for example, to accom-
modate expected business growth) or temporarily (for example, to accept new records while you sort
out your longer term strategy for reducing the database size).
As a preventative measure, you should talk to your users about aging policies for certain types
of records so that you can cycle old unused records out of the system. That is a good conversation
to have before you hit your database size limit. Also, watch the size of your database as it grows
over time so that you can anticipate when you’ll hit the limit of your database. Chapter 9 has more
information about how to monitor your system.
As you saw earlier in this chapter, it’s easy to change the edition and size of a SQL Database at any
time using both T-SQL and PowerShell. If you prefer a user interface, you can also use the Microsoft
Azure management portal to configure the edition and size through the browser.
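For example, T-SQL along these lines changes the edition and maximum size of an existing database
(the database name and values here are illustrative):

ALTER DATABASE MyDb MODIFY (EDITION = 'business', MAXSIZE = 20 GB);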
Summary
In this chapter, you learned how to use the Microsoft Azure management portal, SQL Server
Management Studio, and PowerShell to create and configure a Microsoft Azure SQL Database.
You saw how to connect each of these tools to Azure, and use them to create and manage servers,
firewall rules, and databases.
The chapter then proceeded to discuss cost, pricing, and budget. We detailed the estimation and
optimization of costs, and we explained all the cost-related elements you need to consider, including
storage and bandwidth, as well as the database edition and maximum size settings.
Differences between SQL Server and Microsoft Azure SQL Database
One of the most attractive aspects of Microsoft Azure SQL Database is that it shares virtually the
same codebase and exposes the same tabular data stream (TDS) as on-premises Microsoft SQL
Server. Thus, to a great extent, the same tools and applications that work with SQL Server work just
the same and just as well with SQL Database. Notice that we said to a great extent, because despite
their commonality, there are quite a few SQL Server features that SQL Database does not support.
In this brief chapter, we discuss how and why these two platforms differ from one another, and we
explain the SQL Database constraints you need to be aware of if you have previous experience with
SQL Server.
SQL Server and SQL Database differ in several ways—most notably, in terms of size limitations,
feature support, and T-SQL compatibility. In many cases, these constraints are simply the price you
pay for enjoying a hassle-free, self-managing, self-healing, always-available database in the cloud.
That is, Microsoft cannot responsibly support features that impair its ability to quickly replicate, relo-
cate, and scale a SQL Database instance. This is why SQL Database places limits on database size and
doesn’t support certain specialized features, such as FILESTREAM.
Another common reason why a particular feature or T-SQL syntax might not be supported in
SQL Database is that it’s simply not applicable. With SQL Database, administrative responsibilities
are split between Microsoft and you. Microsoft handles all the physical administration (such as disk
drives and servers), while you manage only the logical administration (such as database design and
security). This is why any and all T-SQL syntax that relates to physical resources (such as path names)
is not supported in SQL Database. For example, you don’t control the location for primary and log
file groups. This is why you can’t include an ON PRIMARY clause with a CREATE DATABASE statement,
and indeed, why SQL Database does not permit a file group reference in any T-SQL statement. Plainly
stated, everything pertaining to physical resources (that is, infrastructure) is abstracted away from you
with SQL Database.
Yet still, in some cases, a certain SQL Server feature or behavior might be unsupported merely
because Microsoft has just not gotten around to properly testing and porting it to SQL Database.
Azure is constantly evolving, so you need to keep watch for updates and announcements. This small
chapter is a great starting point, but the best way to stay current is by reviewing the “Guidelines and
Limitations” section of the SQL Database documentation on the MSDN website. (See http://msdn.
microsoft.com/en-us/library/ff394102.aspx.)
Size limitations
With the exception of the free, lightweight Express edition of SQL Server, there is no practical upper
limit on database size in any edition of SQL Server. A SQL Server database can grow as large as
524,272 terabytes. (For SQL Server Express edition, the limit is 10 gigabytes.)
In contrast, SQL Database has very particular size limitations. As explained in Chapter 2,
“Configuration and pricing,” you can set the maximum size by choosing between the Web and
Business editions. With a Web edition database, you can set the maximum database size to either 1
or 5 gigabytes (GB). With a Business edition database, the maximum database size can range from 10
to 150 GB. The absolute largest supported database size is 150 GB, although partitioning strategies
can be leveraged for scenarios that require databases larger than 150 GB (as explained in Chapter 8,
“Designing and tuning for scalability and high performance”).
Note At the time this book went to press, Microsoft announced the Preview availability of
SQL Database Premium. This is a more costly option than the standard Web and Business
editions of SQL Database (which have been rebranded as Basic and Standard), and it
supports a maximum database size of 500 GB. As we explain in Chapter 8, SQL Database
Premium also lets you scale up for performance using dedicated CPU and memory.
Connection limitations
SQL Database is far less flexible than SQL Server when it comes to establishing and maintaining
connections. Keep the following in mind when you connect to SQL Database:
■ SQL Server supports a variety of client protocols, such as TCP/IP, Shared Memory, and Named
Pipes. Conversely, SQL Database allows connections only over TCP/IP.
■ SQL Database does not support Windows authentication. Every connection string sent to SQL
Database must always include a login user name and password.
■ SQL Database often requires that @<server> is appended to the login user name in
connection strings. SQL Server has no such requirement.
■ SQL Database communicates only through port 1433, and it does not support static or
dynamic port allocation like SQL Server does.
■ SQL Database does fully support Multiple Active Result Sets (MARS), which allows multiple
pending requests on a single connection.
■ Connections to SQL Database can get dropped far more frequently than connections to an
on-premises SQL Server, so client applications need to implement retry logic to cope with
dropped connections. There are several options for doing so:
• The latest version of the Entity Framework (EF6, Microsoft’s recommended data access API
for .NET) has a new Connection Resiliency feature, which automatically handles the retry
logic for dropped connections.
• The Microsoft Enterprise Library Transient Fault Handling Application Block, covered
in Chapter 4, lets you define and implement retry strategies to deal with dropped
connections.
• The ADO.NET SqlConnection class has an OpenWithRetry extension method that handles
the retry logic based on the default retry policy (which must be defined using the Microsoft
Enterprise Library Transient Fault Handling Application Block).
Unsupported features
This section lists many SQL Server capabilities that are not supported in SQL Database, and here we
suggest workarounds where possible. Again, because this content is subject to change, we recom-
mend you check the MSDN website for the latest information. (See http://msdn.microsoft.com/en-us/
library/ff394102.aspx.)
■ Agent Service You cannot use the SQL Server Agent service to schedule and run jobs on
SQL Database.
■ Audit The SQL Server auditing feature records server and database events to either the
Windows event log or the file system, and it is not supported in SQL Database.
■ Backup/Restore Conventional backups with the BACKUP and RESTORE commands are
not supported with SQL Database. However, SQL Database supports an automated backup
schedule that creates transactionally consistent backups in the form of BACPAC files created
in Azure storage. You can also create BACPAC files manually; however, this does not provide
transactional consistency for changes made during the export operation. To ensure transac-
tional consistency for a manual backup, you can either set the database as read-only before
exporting it to a BACPAC, or use the Database Copy feature to create a copy of the database with
transactional consistency and then export that copy to a BACPAC file. See Chapter 5, “Security
and backup,” for more information.
■ Browser Service SQL Database listens only on port 1433. Therefore, the SQL Server Browser
Service, which listens on various other ports, is unsupported.
■ Change Data Capture (CDC) This SQL Server feature monitors changes to a database,
and it captures all activity related to change tables. CDC relies on a SQL Server Agent job to
function and is unsupported in SQL Database.
■ Common Language Runtime (CLR) The SQL Server CLR features (often referred to sim-
ply as SQL CLR) allow you to write stored procedures, triggers, functions, and user-defined
types in any .NET language (such as Microsoft C# or Visual Basic) as an alternative to using
traditional T-SQL. In SQL Database, only T-SQL can be used; SQL CLR is not supported. Note,
however, that this limitation does not apply to SQL Server data types implemented inter-
nally using the CLR (such as xml, geography, and geometry, all of which are supported in SQL
Database).
■ Compression SQL Database does not support the data-compression features found in SQL
Server, which you use to compress tables and indexes.
■ Database object naming convention In SQL Server, multipart names can be used to
reference a database object in another schema (with the two-part name syntax schema.object),
in another database (with the three-part name syntax database.schema.object), and (if you
configure a linked server) on another server (with the four-part name syntax server.database.
schema.object). In SQL Database, two-part names can also be used to reference objects in
different schemas. However, three-part names are limited to reference only temporary objects
in tempdb (that is, where the database name is tempdb and the object name starts with a #
symbol); you cannot access other databases on the server. And you cannot reference other
servers at all, so four-part names can never be used. (A brief example follows this list.)
■ Extended events In SQL Server, you can create extended event sessions that help to
troubleshoot a variety of problems, such as excessive CPU usage, memory pressure, and
deadlocks. This feature is not supported in SQL Database.
■ Extended stored procedures You cannot execute your own extended stored procedures
(which are typically custom-coded procedures written in C or C++) with SQL Database. Only
conventional T-SQL stored procedures are supported.
■ File streaming SQL Server native file-streaming features, including FILESTREAM and
FileTable, are not supported in SQL Database. Instead, you can consider using Azure Blob
Storage containers for unstructured data files, but it will be your job at the application level to
establish and maintain references between SQL Database and the files in blob storage, though
note that there will be no transactional integrity between them using this approach.
■ Full-Text Searching (FTS) The FTS service in SQL Server that enables proximity searching
and querying of unstructured documents is not supported in SQL Database. However, there
is a third-party text search engine library available from Lucene that does work with SQL
Database. For more information, visit http://www.lucene.net.
■ Mirroring SQL Database does not support database mirroring, which is generally a non-
issue because Microsoft is ensuring data redundancy with SQL Database, so you don’t need
to worry about disaster recovery. This does also mean that you cannot use SQL Database as a
location for mirroring a principal SQL Server database running on-premises. However, if you
want to consider the cloud for this purpose, you can host SQL Server inside an Azure virtual
machine (VM) against which you can mirror an on-premises principal database. This solu-
tion requires that you also implement a virtual private network (VPN) connection between
your on-premises network and the Azure VM.
■ Partitioning With SQL Server, you can partition tables and indexes horizontally (by groups
of rows) across multiple file groups within a database, which greatly improves the performance
of very large databases. SQL Database has a maximum database size of 150 GB (or 500 GB, for
the newly announced Premium edition) and gives you no control over file groups, thus it does
not support table and index partitioning.
■ Replication SQL Server offers robust replication features for distributing and synchronizing
data, including merge replication, snapshot replication, and transactional replication. None
of these features are supported by SQL Database; however, SQL Data Sync can be used to
effectively implement merge replication between a SQL Database and any number of other
SQL Databases on Microsoft Azure and on-premises SQL Server databases. See Chapter 7,
“Microsoft Azure SQL Data Sync,” for more information.
■ Resource Governor The Resource Governor feature in SQL Server lets you manage
workloads and resources by specifying limits on the amount of CPU and memory that can be
used to satisfy client requests. These are hardware concepts that do not apply to SQL Data-
base, so the Resource Governor is unsupported.
■ Service Broker SQL Server Service Broker provides messaging and queuing features, and it
is not supported in SQL Database.
■ System stored procedures SQL Database supports only a few of the system stored
procedures provided by SQL Server. The unsupported ones are typically related to SQL Server
features and behaviors not supported by SQL Database. At the same time, SQL Database
provides a few new system stored procedures not found in SQL Server that are specific to SQL
Database (for example, sp_set_firewall_rule).
■ Tables without a clustered index Every table in a SQL Database must define a clustered
index. By default, SQL Database will create a clustered index over the table’s primary key col-
umn, but it won’t do so if you don’t define a primary key. Interestingly enough, SQL Database
will actually let you create a table with no clustered index, but it will not allow any rows to be
inserted until and unless a clustered index is defined for the table. This limitation does not
exist in SQL Server.
■ Transparent Data Encryption (TDE) You cannot use TDE to encrypt a SQL Database like
you can with SQL Server.
■ USE In SQL Database, the USE statement can refer only to the current database; it cannot be
used to switch between databases as it can with SQL Server. Each SQL Database connection is
tied to a single database, so to change databases, you must connect directly to the database.
■ XSD and XML indexing SQL Database fully supports the xml data type, as well as most
of the rich XML support that SQL Server provides, including XML Query (XQuery), XML Path
(XPath), and the FOR XML clause. However, XML schema definitions (XSD) and XML indexes
are not supported in SQL Database.
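To make the naming-convention limitation concrete, here is a small sketch (the schema, table,
database, and server names are invented for illustration):

-- Two-part names work in SQL Database, just as they do in SQL Server:
SELECT * FROM Sales.Customer;

-- Three-part names are allowed only for temporary objects in tempdb:
SELECT * FROM tempdb.dbo.#MyTempTable;

-- Not supported in SQL Database: referencing another database or another server
-- SELECT * FROM OtherDb.dbo.SomeTable;
-- SELECT * FROM OtherServer.OtherDb.dbo.SomeTable;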
Summary
In this brief chapter, you learned about the important differences between on-premises SQL Server
and SQL Database on Microsoft Azure. We explained the SQL Database limitations on size, as com-
pared to a virtually unlimited database size supported by SQL Server. We also discussed connection
limitations, and important considerations to keep in mind with respect to dropped connections, which
occur with relative frequency in SQL Database. The chapter concluded by enumerating the many SQL
Server features that are either unsupported or have limited support in SQL Database, and offered
workarounds where possible.
The information in this chapter will help you decide whether or not SQL Database is suitable for
your particular scenario. Of course, if you determine that it is not, always remember that you can run
on-premises SQL Server in an Azure VM (we show you how in Chapter 6). This IaaS approach provides
you with full SQL Server functionality in the cloud, compared to the PaaS approach of going with
SQL Database.
Migrating databases
— Eric Boyd
In our experience helping customers develop and migrate applications to Microsoft Azure, there has
always been a need to migrate data along with those applications, even for “all new development”
projects. So you really need to know about the solutions that are available for migrating your data to
SQL Database and to understand their strengths and weaknesses. In this chapter, you will work with
multiple tools and techniques for migrating data to SQL Database, including Transact-SQL (T-SQL)
scripts, SQL Data-Tier Applications (BACPAC), bulk copy (bcp), and the SQL Database Migration
Wizard.
Note As mentioned in Chapter 1, “Getting started with Microsoft Azure SQL Database,”
and practiced throughout this book, the term SQL Database refers specifically to Microsoft
Azure SQL Database in the cloud, whereas the term SQL Server refers specifically to local
(on-premises) SQL Server.
In addition to the tools and techniques discussed in this chapter, there are many other solutions
available from both Microsoft and third-party vendors that might also fit your scenario and require-
ments. For example, SQL Server Integration Services (SSIS) is a great solution if you need to import
data from data sources beyond SQL Server, like Excel spreadsheets, or other database platforms like
Oracle. If you are starting with an existing database in SQL Database and you want to apply incre-
mental changes and updates to your database, third-party tools like Red-Gate SQL Compare and
Data Compare are also good solutions. You should explore these and all other available solutions to
help you migrate data from your on-premises data stores and database servers to SQL Database. You
need to understand the capabilities and limitations of each option so that you can effectively choose
a solution that best fits your scenario.
The existing code and data we manage (often referred to as legacy, even if it was born in the last
year) drives us to consider migration strategies when evaluating the public cloud and Microsoft Azure.
In this chapter, we demonstrate various ways to move data into SQL Database from existing legacy
systems and on-premises SQL Server servers, and we discuss other things to consider when migrating
data to SQL Database.
In the next section, you will use SQL Server Management Studio (SSMS) to write T-SQL scripts that
create and populate a local SQL Server database. Note that you can also use SQL Server Data Tools
(SSDT) inside Microsoft Visual Studio to build and run T-SQL scripts. (You will learn much more about
SSDT in Chapter 10, “Building cloud solutions.”) Also, note that all the scripts in this chapter can be
downloaded from the book’s companion website. (See the Introduction for details.)
If you have access to a SQL Server instance that you can create a local database on, you can use
that SQL Server instance. Otherwise, you will need to install the SQL Server Express edition to host
the database on your local machine. A step-by-step procedure for doing so can be found in the
Introduction, in the section “Installing the SQL Server Express edition.”
Note This chapter assumes you are using the SQL Server Express edition for your local SQL
Server database, which has a server instance name of .\sqlexpress. If you are using another
edition, you must replace the instance name .\sqlexpress specified in the instructions with
the name of the server instance you are using. For example, if you are running a primary in-
stance of the SQL Server Developer edition on your local machine, you can simply specify
the dot (.) symbol, or localhost. If you are running a named instance on your local machine,
append a backslash followed by the name of the instance (for example, .\myinstance or
localhost\myinstance).
USE WineDb
GO
To create the WineDb database using this T-SQL, follow these steps:
1. Launch SSMS. An easy way to do this is to press the Windows key, type sql server
management studio on the Start screen, and press Enter.
2. In the Connect To Server dialog, connect to your local SQL Server instance using the
appropriate credentials, as shown in Figure 4-1.
3. Once you are connected, your SQL Server instance will be listed in the Object Explorer pane.
Right-click on your SQL Server instance, and choose New Query as shown in Figure 4-2. This
opens a new query window.
FIGURE 4-2 The New Query context menu option in SSMS Object Explorer
4. Type the code shown in Listing 4-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
5. Press F5 (or click the Execute button in the toolbar) to run the script.
6. Expand the Databases node beneath your SQL Server instance in Object Explorer (or, if it’s
already expanded, right-click it and choose Refresh). The WineDb database now appears.
7. Expand the WineDb database node, and then expand the Tables node beneath it to view the
Customer, Order, and Wine tables.
There is now a WineDb database running on your local SQL Server instance. This is the source
database you will migrate to Microsoft Azure using various tools and techniques throughout the rest
of this chapter.
Listing 4-2 shows the T-SQL script you will run in the next procedure. This script populates
the Wine table with 15 rows of data and the Customer table with 3 rows of data. Notice how the
IDENTITY_INSERT setting is turned on before inserting rows into a table, and then turned off again
after. Turning this setting on allows the script to provide explicit values for each new record’s primary
key, which would normally be assigned automatically by SQL Server because the primary keys were
designated with IDENTITY. (Refer to Listing 4-1 earlier.)
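The pattern in Listing 4-2 follows this general shape. (This is a trimmed sketch; the real listing inserts
15 wines and 3 customers, and the column list here is abbreviated to the Wine columns used earlier in
the book, with illustrative values.)

SET IDENTITY_INSERT Wine ON;

INSERT INTO Wine (WineId, Name, Category) VALUES (3, 'Mendoza', 'Merlot');

SET IDENTITY_INSERT Wine OFF;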
1. Open a new query window in SSMS (or delete all the code in the same query window you
used in the previous procedure).
2. Type the code shown in Listing 4-2 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
3. Press F5 (or click the Execute button in the toolbar) to run the script.
You might be thinking there must be a better way, and of course, there is. SSMS can examine the
database and generate a T-SQL script with INSERT statements for all the data in the tables. In this
next procedure, you will use SSMS to automatically generate a T-SQL script from your local WineDb
SQL Server database that you can then use to populate your WineCloudDb SQL Database, effectively
migrating the data from SQL Server to SQL Database.
1. If you’ve closed SSMS since the previous procedure, start it up again and connect to your local
SQL Server instance that contains the WineDb database.
2. In Object Explorer, expand the node for your SQL Server instance name.
3. Beneath your SQL Server instance name, expand the Databases node to display the list of
databases.
4. If the WineDb database does not appear, right-click the Databases node and choose Refresh.
5. Right-click on the WineDb database, and choose Tasks | Generate Scripts. This launches the
Generate And Publish Scripts wizard.
6. On the Introduction page, click Next to display the Choose Objects page.
7. On the Choose Objects page, you have the option of scripting the entire database or selecting
specific objects you want to script. This is not limited to tables; it can also include other data-
base objects, such as views, stored procedures, triggers, and so on. Leave the default option
selected to script the entire database, and click Next to display the Set Scripting Options page.
8. On the Set Scripting Options page, click the Advanced button to display the Advanced
Scripting Options dialog.
10. Scroll down to the Types Of Data To Script property. (It’s the last property in the General
category.) By default, this option is set to script only the database schema. You can also
choose to script only data or both schema and data. In our current scenario, we only want to
script the data, so choose Data Only, as shown in Figure 4-3.
15. Click Finish to generate the script, which is then displayed in a new query window, as shown
in Figure 4-4.
At this point, you have created a WineDb database, with both schema and data, in your local
SQL Server instance. You created T-SQL scripts by hand that you executed to create the schema and
insert data into your local SQL Server database. You also learned how to generate these T-SQL scripts
using SQL Server Management Studio. The focus of this section was to set up the source database
that will be migrated to SQL Database throughout the rest of this chapter, but you can execute this
same T-SQL script in SQL Server and SQL Database. To execute this script and populate a SQL Data-
base instance, connect your SSMS query window to the SQL Database instance instead of the local
SQL Server database, and execute the T-SQL script in that window. You can also execute these T-SQL
scripts using the SQL Database management portal as mentioned in the “Creating a SQL Database
instance” section found in Chapter 1.
When migrating a database from on-premises SQL Server to SQL Database, you often want to
migrate your data along with your database and instance objects, and that is when BACPAC (.bacpac
files) becomes useful. BACPAC is similar to DACPAC, but in addition to the database objects (schema),
it also includes the actual data from the database in the package.
Microsoft Azure Storage accounts authenticate access using one of two 512-bit storage access keys
(a primary and a secondary). These keys are automatically generated for you when you create a stor-
age account. You can regenerate these keys at any time in the Microsoft Azure management portal
(and via the Microsoft Azure Service Management API). To help keep your storage account secure, it is
recommended that you regenerate your access keys periodically. Changing authentication credentials
to services that other services and applications depend on without causing downtime can be chal-
lenging. Microsoft Azure simplifies this by providing the two access keys, which allows you to rotate
access keys without causing downtime.
To create a Microsoft Azure Storage account you can use to upload and store your .bacpac file,
follow these steps:
5. In the data entry area to the right of the QUICK CREATE link, do the following:
a. For URL, type mywinestorage. This will be the name of your storage account. (It can be
any name from 3 to 24 lowercase letters and numbers.) This must be a globally unique
name, so you’ll need to choose something other than mywinestorage if the portal informs
you that the specified storage account name is already in use (which is very probable).
b. For LOCATION/AFFINITY GROUP, select the Microsoft Azure data center where you want
to create your storage account from the drop-down list. This should be the same data
center that hosts your SQL Database server. (See Chapter 2, “Configuration and pricing,”
to understand the pricing implications of choosing a data center location.)
c. For REPLICATION, leave the default Geo-Redundant setting, which enables geo-
replication. This synchronizes a copy of your data with another Microsoft Azure data
center, to enable recovery in the event of a data center disaster. (Again, we say more on
this in Chapter 2.) The portal should appear similar to Figure 4-5.
6. Click CREATE STORAGE ACCOUNT to start provisioning the new storage account. In a few
moments, you will see it appear in the portal with an Online status, as shown in Figure 4-6.
FIGURE 4-6 Viewing the new storage account in the management portal
7. Click on the storage account name (mywinestorage, or whatever name you assigned in the
previous procedure).
FIGURE 4-7 Viewing the primary and secondary access keys generated for the new storage account
9. Click the copy button to the right of the PRIMARY ACCESS KEY text box (the icon that looks
like two documents) to copy the primary access key to the clipboard. You will paste this key in
a later step, so be sure not to copy anything else to the clipboard until then.
10. If you are prompted by the browser to permit clipboard access, click Allow Access.
11. Click the checkmark icon in the lower-right side of the dialog to return to the Storage Account
home screen.
Now that you have created a storage account, the next step is to create a blob container for it.
Then you will be able to upload a .bacpac file to the blob container within the storage account, and
finally import the .bacpac file to SQL Database. To create the blob container, follow these steps:
1. Click the CONTAINERS link at the top of the page, as shown in Figure 4-8.
2. Click the ADD button at the bottom of the page to display the New Container dialog.
4. For ACCESS, leave the default Private setting, which ensures that only the account owner (you)
can access the new container.
5. Click the checkmark icon in the lower-right side of the dialog to create the container. When
the process is complete, you’ll see a notiication at the bottom of the portal.
1. If it’s not still opened from an earlier procedure, launch SSMS and connect to your local SQL
Server instance that contains the WineDb database.
2. In the Object Explorer, expand the node for your SQL Server instance name.
3. Beneath your SQL Server instance name, expand the Databases node to display the list of
databases.
4. Right-click on the WineDb database, and choose Tasks | Export Data-Tier Application, as
shown in Figure 4-9. This launches the Export Data-Tier Application wizard.
5. On the Introduction page, click Next to advance to the Export Settings page.
6. In the Settings tab on the Export Settings page, click the Save To Microsoft Azure radio
button.
Tip If you don’t want to include the entire database in the .bacpac ile, you can
choose just the database objects you want to export in the Advanced tab of the
Export Settings page.
7. Click the Connect button to launch the Connect To Microsoft Azure Storage dialog.
8. For Storage Account, type mywinestorage (or whatever globally unique name you assigned
to the storage account when you created it). Notice that the HTTPS check box at the bottom
of the dialog gets selected automatically when you type the account name. This is expected,
and you should leave it selected.
10. Click the Connect button. The wizard connects to the storage account and returns to the
Export Settings page, where the Container drop-down list has now become enabled.
11. For Container, select dbimport from the drop-down list. (This is the container you created in
the previous procedure.) The wizard should now appear as shown in Figure 4-11.
13. Review the Summary page. If everything looks correct, click Finish to begin the export.
Once the export has finished successfully, the wizard displays the Results page with a list of all the
tasks it completed. The .bacpac file has now been exported and uploaded to your Microsoft Azure
Storage account. You can now click Close to close the Export Data-Tier Application wizard.
3. If you have a WineCloudDb database in your list of databases from previous chapters, delete it
now:
a. Click on any column to the right of the Name column to select the WineCloudDb
database. (Don’t click on the database name itself.)
d. If that was the only database on the SQL Database server, you will also be asked if you
also want to delete the server. Because you are going to import your .bacpac file into a
new database on this server, click NO.
e. Click OK to dismiss the notification message that the database was deleted.
5. Click IMPORT, as shown in Figure 4-12. This displays the IMPORT DATABASE dialog box.
6. Click the folder icon to the left of the BACPAC URL text box. This opens the BROWSE CLOUD
STORAGE dialog box.
7. An explorer tree that displays your Microsoft Azure Storage accounts and their containers
appears on the left side of the dialog. Expand your storage account to display the dbimport
container inside of it.
8. Click the dbimport container to display its contents on the right. You can see the
WineDb.bacpac file you recently uploaded to the container, as shown in Figure 4-13.
FIGURE 4-13 The contents of the dbimport container displayed in the BROWSE CLOUD STORAGE dialog
11. For NAME, change the database name from WineDb (the name of the local SQL Server
database, which was read from the .bacpac file into which the database was exported) to
WineCloudDb, which is the actual name you want to give the new SQL Database instance.
12. For SERVER, choose any available server from the drop-down list to host the database (or
choose New SQL Database Server from the drop-down list to create a new server on the
fly). Once you choose a server, the SERVER LOGIN NAME and SERVER LOGIN PASSWORD
text boxes appear. The SERVER LOGIN NAME text box is automatically populated with your
administrator login name.
Important If you create the storage account and the SQL Database server in
different regions, you will incur additional Microsoft Azure billing charges for
network bandwidth between the two regions. In this event, the Microsoft Azure
management portal will alert you with a warning message. See Chapter 2 for
more information on pricing for Microsoft Azure.
13. For SERVER LOGIN PASSWORD, type the password for your login. The IMPORT DATABASE
dialog should appear similar to Figure 4-14.
FIGURE 4-14 Importing a .bacpac file from Microsoft Azure Storage into a new SQL Database instance
14. Click the checkmark icon in the lower-right side of the dialog to begin the import. Once the
import has completed, a notification that the import was successful appears at the bottom of
the page, and the new WineCloudDb database appears in the list of SQL databases. (Sometimes,
it is necessary to refresh the page by pressing F5 to get the new database to appear.)
Using BACPAC Data-Tier Applications is one of the simplest ways to migrate both the database
schema and data to a SQL Database instance. You can also use DACPAC Data-Tier Applications to
migrate only the schema, if that meets your migration requirements. One of the things that makes
SQL Data-Tier Applications easy to work with is that you can use familiar tooling, including SQL
Server Management Studio (SSMS) and SQL Server Data Tools (SSDT) in Visual Studio. As you just
saw, you can perform a database migration entirely using only the familiar SQL Server tools and the
Microsoft Azure management portal, which makes this technology accessible for almost all Microsoft
developers.
You should be aware of one limitation of using BACPAC: you cannot import a .bacpac file into an
SSDT database project. (Chapter 10 covers SSDT database projects.) If you need to make any schema
modifications between exporting your database and importing your database into SQL Database,
you cannot achieve this with a .bacpac file. Instead, you can import .dacpac files (which contain only
schema information) into SSDT database projects, but you would not have the data in your package
using a DACPAC. Because of feature limitations and syntax differences between SQL Server and SQL
Database, you will often need to make schema changes before deploying your databases to SQL
Database. As a result, you will either need to extract .dacpac files without data or make any necessary
schema and syntax changes to your local SQL Server database prior to exporting your .bacpac files.
In this section, you will learn how to use the bcp utility to export data files from a source SQL
Server database and import them into a destination SQL Database instance.
For the purposes of this exercise, you will drop and re-create the WineCloudDb tables populated
by the BACPAC migration you performed in the previous section. To do this, you will use the SQL Da-
tabase management portal to run the T-SQL script shown in Listing 4-3. (Remember, though, you can
also run T-SQL scripts against a SQL Database instance using any of the familiar locally installed tools,
such as SSMS or SSDT.) You’ll notice that this script is almost exactly the same as the one in Listing 4-1
that you used to create a new local SQL Server database at the start of the chapter. The only differ-
ence is that this T-SQL script starts with three DROP TABLE statements that delete the existing tables
(populated by the BACPAC migration you performed in the previous section), which are then re-
created as empty. This has the net effect of migrating just the schema of a database without any data.
LISTING 4-3 T-SQL script to drop and re-create the WineCloudDb tables
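In rough outline, the script takes the following form (a sketch only: the drop order shown assumes the Order table references the Wine and Customer tables, and the re-created tables use the same CREATE TABLE statements as Listing 4-1):
DROP TABLE [Order]
DROP TABLE Customer
DROP TABLE Wine
GO
-- ...followed by the CREATE TABLE statements for Wine, Customer, and Order from Listing 4-1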
3. Click the WineCloudDb database. (This is the database you imported from a BACPAC file in the
previous section.)
5. Scroll the page down a bit, find the MANAGE URL link in the “Quick Glance” section at the
right of the page, and click the link. This opens a new browser tab to the SQL Database
portal’s login page.
Note The SQL Database portal is Silverlight-based. If you don’t have Silverlight
installed, you will first be prompted to download it before you can use the portal.
6. For USERNAME and PASSWORD, type the administrator login name and password for the
server, respectively, and click Log On.
7. Click the New Query button in the toolbar at the top of the SQL Database management portal
to open a blank query window.
8. Type the code shown in Listing 4-3 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
The WineCloudDb SQL Database instance now has empty Wine, Customer, and Order tables that
are ready for migration with bcp. Before running your first bcp command, it’s a good idea to become
acquainted with bcp syntax. Table 4-1 shows the common bcp parameters that need to be specified
for a typical import or export operation.
Parameter Description
source | target Database Object For export, it specifies the table, view or T-SQL query to be used as the source
for the export operation. For import, it specifies the table to be used as the target for the import
operation.
in | out | queryout BCP Operation To import data into a table or view, specify in. To export data to a data file from
a table or view, specify out. To export data to a data file from a query, specify queryout.
data file Data File For export, it specifies the name of the data file to create from the table, view, or
T-SQL query. For import, it specifies the name of the data file to retrieve data for the table being
imported. This parameter must include the full path to the data file.
–S server Server Name Specifies the server name of the SQL Server or Microsoft Azure SQL Database that
bcp should connect to.
–T Windows Authentication Use a trusted connection that doesn’t require a user name and
password. It cannot be combined with –U and –P. Trusted connections are supported only for SQL
Server. When connecting to SQL Database, you must use –U and –P for SQL Server authentication
instead.
–U login SQL Server Authentication Login Combine with –P to connect using SQL Server
authentication with either SQL Server or SQL Database. Cannot combine with –T.
–P password SQL Server Authentication Password Combine with –U to connect using SQL Server
authentication with either SQL Server or SQL Database. Cannot combine with –T.
–n Use Native Data Types Recommended when migrating between SQL Server, SQL Database, or
both. For non-Microsoft databases, this switch is not supported, and bcp will prompt you for the
data type of each column (or you can define the data types in a separate format file).
–q Support Quoted Identifiers Allows you to use a database, owner, table, or view name that
contains a space or single quotation mark (executes the SET QUOTED_IDENTIFIERS ON statement).
Note that bcp is very particular about the first three parameters. The database object, operation, and
data file parameters must always be specified in that order. The remaining switch parameters can
appear in any order on the command line.
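Putting the parameters together, a typical bcp command in this chapter follows this general pattern (a sketch; the angle-bracket placeholders stand in for your own values):
bcp <database>.<schema>.<table> {in | out | queryout} <data file> -S <server> {-T | -U <login> -P <password>} -n -q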
Exporting data
As you might have already inferred from its syntax, the bcp utility migrates data into and out of
individual tables and not an entire database. The local WineDb database contains three tables: Wine,
Customer, and Order. The Wine and Customer tables both have data, and the Order table is empty, so
you will export data from the database by running the bcp utility twice: once for the Wine table and a
second time for the Customer table.
To export data from your local WineDb database into bcp data files, follow these steps:
1. Launch a command-prompt window. An easy way to do this is to press the Windows key, type
cmd on the Start screen, and press Enter.
2. Type bcp WineDb.dbo.Wine out Wine.dat –S .\sqlexpress –T –n –q, and press Enter.
Note These instructions assume you are using the SQL Server Express edition,
which has a server name of .\sqlexpress. If you are using another edition, you
must replace the server name .\sqlexpress specified in the instructions with
the name of the server you are using. Furthermore, if your server doesn’t support
Windows authentication, you cannot specify –T, and must instead use the –U and
–P switches for SQL Server authentication.
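The Customer table is exported the same way. Assuming the same SQL Server Express instance and switches, the second command looks like this:
bcp WineDb.dbo.Customer out Customer.dat -S .\sqlexpress -T -n -q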
FIGURE 4-15 Exporting SQL Server tables to data files with bcp
Importing data
Now you will use bcp once more to import the data files you just exported into the WineCloudDb SQL
Database instance, only this time you will specify in to perform an import operation.
When importing with bcp, you need to pay attention to the size of the data file being imported.
If your data set is large, you will likely need to split it up into multiple chunks. You can easily do this
using the –b switch parameter to specify the number of rows to import as one batch. Each batch is
imported and logged as a separate database transaction so that if an error occurs, only inserts from
the current batch are rolled back. By default, bcp imports all rows in a data file as one batch, but if
you are importing large numbers of rows, you will likely experience connection loss and throttling
from SQL Database if you don’t specify a smaller batch size. You might need to experiment with your
data set to determine the right batch size to avoid throttling and connection loss with bcp.
The bcp syntax provides special switches to support batched import operations and to let you
specify hints that enable other options. These additional switch parameters are shown in Table 4-2.
Parameter Description
–b batch_size Batch Size Specifies the number of rows to process for a batched import operation.
–F first_row First Row When batching with –b, specifies the starting row in the data file to use as the starting
point for the import operation.
–L last_row Last Row When batching with –b, specifies the ending row in the data file to use as the stopping
point for the import operation.
–h hints Hints Enable other options. For example, you can sort and order the data using the ORDER hint,
force constraints to execute during the import operation using the CHECK_CONSTRAINTS hint, and
lock the table during the import operation using the TABLOCK hint.
To import the data files exported from the local WineDb SQL Server database into the
WineCloudDb SQL Database instance, follow these steps:
1. If it’s not still open from the previous export operation, launch a new command
prompt.
Note Replace <server>, <login-id>, and <password> with the server name,
administrator user name, and administrator password of the SQL Database server
hosting your WineCloudDb database.
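As a sketch, an import of the Wine data file batched in groups of 15 rows would look something like the following, using the placeholders from the note above (the batch size here is illustrative):
bcp WineCloudDb.dbo.Wine in Wine.dat -b 15 -S <server>.database.windows.net -U <login-id>@<server> -P <password> -n -q
A similar command, substituting Customer.dat and the Customer table, imports the second data file.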
FIGURE 4-16 Importing a data file to a SQL Database table with bcp
You have now imported both the Wine and Customer tables from the local WineDb SQL Server
database into the WineCloudDb SQL Database instance using the bcp utility. In our example, we
migrated a very small data set to SQL Database, so it wasn’t really necessary to break the Wine
table up into 3 batches of 15 rows each (but now you’ve learned how). In fact, bcp was designed to
efficiently migrate large amounts of data into and out of SQL Server. So if you have large tables of
data to migrate into and out of SQL Database (and/or SQL Server), this exercise has shown you how to
batch the overall import operation with bcp.
All the database migration tools and solutions we’ve explored to this point are built into the
Microsoft SQL Server tools or Visual Studio. However, there are other nice and useful tools outside of
the commercial Microsoft toolset, and the Microsoft Azure SQL Database Migration Wizard is one of
them. This is a free, open source tool that interactively walks you through the process of migrating
a database to SQL Database. The migration wizard was created by George Huey, a Principal
Architect at Microsoft, back in the early days of Microsoft Azure SQL Database when it was still called SQL
Azure. It has been battle-tested by thousands of users and is often updated with bug fixes and feature
enhancements, often as a result of great community feedback.
Note Even though this tool was created by a Microsoft employee, it is not an official
Microsoft product and is not supported by Microsoft.
The Microsoft Azure SQL Database Migration Wizard greatly simplifies migrating databases to SQL
Database by doing these three things very well:
■ Analyzes a SQL Server database, SQL Profiler trace, or T-SQL script for SQL Database
compatibility issues
You already worked with T-SQL scripts and migrated data to SQL Database using BACPAC and bcp.
But one thing you haven’t done to this point is analyze the database for incompatibilities, and that’s
one of the major benefits of the Microsoft Azure SQL Database Migration Wizard.
At the time of this writing, there are two different versions of the SQL Database Migration Wizard.
Version 3.X supports SQL Server 2008 R2, and version 4.X supports SQL Server 2012. We assume you
are running SQL Server 2012, so you should install version 4.X. It’s reasonable to expect that future
versions of the tool will be released to work with future versions of SQL Server, so you just need to
pay attention to which version of the tool you are downloading.
To download and install the SQL Database Migration Wizard, follow these steps:
1. Navigate your web browser to http://sqlazuremw.codeplex.com. This takes you to the tool’s
dedicated Codeplex page. (Note that the URL has a reference to “SQL Azure” in it, because the
tool was created back when SQL Database was named SQL Azure.)
2. Click the DOWNLOADS button at the top of the page. This takes you to a page that lists all the
available SQL Database Migration Wizard downloads.
3. Scroll down to find and click the download link for SQLAzureMW v4.0.18 Release Binary for
SQL Server 2012 (or, as mentioned, find and click the Release Binary link with the version
number that corresponds with the version of SQL Server you are running).
5. When prompted, click Open Folder to launch an Explorer window to the location on your
computer where you saved the downloaded .zip file.
7. In the Properties dialog, click the Unblock button, as shown in Figure 4-17. If you don’t
unblock the .zip file, you will still be able to extract it, but you won’t then be able to run the
tool from the extracted location.
FIGURE 4-17 Unblocking the downloaded .zip file using the Properties dialog
10. Click Extract to extract the contents of the .zip file to a new folder in the same location
and with the same name as the .zip file.
After the .zip file is extracted, the folder with the extracted files opens up automatically in a new
Explorer window, and you are ready to begin using the tool.
Migrating a database
To use the SQL Database Migration Wizard to migrate the WineDb database, follow these steps:
1. In the Explorer window opened to the extracted files, double-click the file SQLAzureMW.exe
to launch the tool. This displays the wizard’s Select Process page, as shown in Figure 4-18.
2. In the options on the right, choose the Database radio button beneath Analyze / Migrate,
and click Next. This displays the Connect To Server dialog.
Note The TSQL File radio button is useful if you already previously scripted your
database objects to a T-SQL file, in which case the tool can also analyze and
migrate using that T-SQL file.
3. For Server Name, type .\sqlexpress (or the name of your local SQL Server instance that
contains the WineDb database).
4. If your local server requires SQL Server authentication, choose the Use A Specific User ID
And Password radio button and supply your login name and password.
5. Leave the other options set to their defaults, and click the Connect button at the bottom
of the dialog.
6. The Select Source page now displays a list of the databases installed on the local SQL Server
instance you just connected to. Select the WineDb database, and click Next.
7. The Choose Objects page appears. By default, this page is set to script all the database
objects, but you can select specific database objects if you want. Leave the default option
selected to script the entire database, and click Next.
9. When prompted to generate the SQL script, click Yes. This creates a script to generate the
database schema and runs bcp to export the individual tables from the local WineDb data-
base.
10. When processing completes, the wizard displays the Results Summary page, as shown
in Figure 4-19. You should encounter no errors with the WineDb database. However, if there
are errors, this is where you will discover them, because the wizard will refuse to migrate the
database until you resolve the errors.
Note The Results Summary page uses color coding to make it easy for you to
spot problems. Green and blue indicate success, but if there are compatibility
issues, they will show up in either red or dark red. Red indicates an error that pre-
vents migration, which you need to resolve, while dark red text indicates that an
incompatibility was found, but the SQL Database Migration Wizard knows that it
can resolve the issue automatically.
FIGURE 4-19 The Results Summary page after successfully generating and running a script
FIGURE 4-20 The SQL Script tab displays the generated schema creation script
12. If you would like to save the script to a file for later use or review, click Save and select a
location to save the file.
13. Click Next to begin configuring the deployment to SQL Database. This launches the Connect
To Server dialog.
b. For User Name, type <login-id>@<server> (replace <login-id> with the server’s
administrator user name, and replace <server> with the name of your SQL Database
server).
c. For Password, type the server’s administrator password. The Connect To Server
dialog should appear similar to Figure 4-21.
d. Click Connect to connect to your SQL Database server. This closes the Connect To Server
dialog and returns to the wizard.
15. The Setup Target Server Connection page appears, and lists all the databases on the server. If
you have been following along with the previous procedures, you will see the WineCloudDb
database appear in the list. You want to begin with an empty database, so delete the current
one as follows:
16. Click the Create Database button at the bottom of the dialog. This launches the Create
Database dialog.
17. For Enter Database Name, type WineCloudDb. The Create Database dialog should appear
similar to Figure 4-22.
18. Click the Create Database button. This creates an empty SQL Database named WineCloudDb,
closes the Create Database dialog, and returns to the wizard.
19. Select the newly created WineCloudDb database in the list, and click Next.
21. As the deployment progresses, you will see status updates written to the Target Server
Response page. When the deployment completes successfully, the Target Server Response
page should appear similar to Figure 4-23.
FIGURE 4-23 The Target Server Response page after a successful deployment
You have now deployed both the schema and data to the WineCloudDb SQL Database instance
using an intuitive step-by-step tool, thanks to the Microsoft Azure SQL Database Migration
Wizard. Beyond deploying both your database schema and data, it also analyzed your schema for
compatibility issues when migrating from SQL Server to SQL Database.
1. Generated T-SQL scripts for all the database objects (schema) in the local SQL Server database
3. Analyzed the generated T-SQL script with a pattern matching rules engine that uncovers known
incompatibilities and limitations
4. Deployed the database schema to SQL Database by executing the generated (and potentially
autocorrected) T-SQL scripts
5. Imported data into SQL Database from the exported data files using bcp
All these steps (with the exception of the analysis step) could have been performed independently,
as you did in the previous sections of this chapter. The SQL Database Migration Wizard just packages
everything up in an easy-to-use tool that visually and interactively walks through the process, without
you needing to use multiple tools and command prompts. But the rules engine analysis that the SQL
Database Migration Wizard conducts on your local database schema is not something you can do
with the other tools. This analysis is a unique and extremely compelling capability of the wizard.
The Microsoft Azure SQL Database Migration Wizard is open source, and you can look at the
internals of this tool if you want. If you discover an incompatibility between SQL Server and SQL
Database that the tool doesn’t catch, or you’re just curious about the predefined syntax rules, you can
easily view the rules. They are defined in an XML file named NotSupportedByAzureFile.Config, which
can be found in the same directory as SQLAzureMW.exe. If you are comfortable with regular
expressions, you can even add your own rules to the SQL Database Migration Wizard by modifying this
XML file with a text editor.
For lightweight scenarios, you saw how T-SQL scripts can be generated from a SQL Server
database and executed against a SQL Database instance. SQL Data-Tier Application .bacpac files make
it easy to package an entire database, including both schema and data, and import that into a SQL
Database instance, but this approach operates at the database level and doesn’t allow you to migrate
individual database objects. Furthermore, for larger databases, the size of the .bacpac file can make it
difficult to migrate to SQL Database. Bulk copy with bcp is an efficient and high-performance way to
migrate large amounts of data to SQL Database, but it doesn’t do anything to migrate your database
objects (schema). Finally, the Microsoft Azure SQL Database Migration Wizard is a free, open source
project on Codeplex (not commercially supported by a software vendor) that brings together
the process of migrating the schema and data to SQL Database while generating T-SQL scripts and
automating bcp.
The topics of security, availability, and disaster recovery top the list of concerns that customers
raise when considering the public cloud. These are certainly not new concerns introduced with
the cloud; customers have been architecting solutions to deal with these same concerns since long
before the cloud. The cloud is simply unfamiliar territory that causes these foundational concerns to
be revisited. Thus, customers need these top concerns addressed with reasonable solutions before the
public cloud is a viable option. Microsoft does a great job of putting customers’ concerns at ease on
these topics with the security processes and certifications that are in place in Microsoft Azure, along
with the features of the platform that provide customers with the control and visibility they need.
In this chapter, we discuss security and backup concerns in the cloud. We start by explaining
the general security responsibilities of any public cloud vendor, and then talk more specifically
about security in Microsoft Azure and Microsoft Azure SQL Database. You will learn how to secure
SQL Database by configuring the firewall as you create custom firewall rules and define users and
permissions.
Security and backup often go hand in hand. Notwithstanding all other security-related concerns,
how “secure” is your business if you have no backup in the event of an unforeseen disaster? So toward
the end of this chapter, you will also learn how to copy and back up SQL Database, and how to
schedule automated backups.
Security responsibilities of the public cloud vendor
Some security concerns can be managed and addressed only by public cloud vendors, because
customer access is limited to higher-level abstractions over the raw computing infrastructure, re-
sources, and services. The customer typically cannot gain direct access to things like network routers,
switches, and firewalls, as well as physical servers and the hypervisor, which is the software layer that
virtualizes the hardware for multiple operating systems to run on a single physical server. As a result,
it is very important to have a reputable cloud vendor with a successful history that you can count
on and trust. But you cannot rely only on faith in a vendor; you also need transparency and insight
into the resources and practices of your cloud vendor, and this includes their security practices and
procedures, as described in the following sections.
Isolating tenants
As is the case with vendor personnel, you don’t want other tenants of the cloud vendor to be able to
gain access to your data and applications. (Multitenancy is an architecture in which a single infrastruc-
ture component serves multiple customers, where each customer is called a tenant.) When you are
using multitenant services, this is a concern that must be managed by the vendor.
Auditing activities
Much like compliance, knowing who did what and when they did it is a responsibility that is shared
between the cloud vendor and the customer. Only the cloud vendor can track and provide an audit
log of the activities that occur in the platform services. But it’s the customer’s responsibility to track
the application-level activities. Because accurate and detailed auditing is a common requirement
for most compliance certifications, it’s an important capability both for your cloud vendor and your
applications to provide. Effective auditing requires you to provide unique
credentials for every user and ensure that users do not share their credentials. If multiple users share a
single account, you cannot possibly know exactly who performed an activity logged for that account.
Although Microsoft invests a lot of effort into ensuring SQL Database is a secure and reliable
service, you still need to do a number of things to create a secure experience when using it. In the
following sections, you will walk through step-by-step procedures that help to secure SQL Database.
You will begin by securing access to and communication with SQL Database. Then you will walk
through application-level security concerns such as SQL injection attacks and data encryption.
You will use SSMS to create the database. Then you will execute the script in Listing 5-1 to create
some tables and populate them with some data.
1. Launch SSMS. An easy way to do this is to press the Windows key, type sql server
management studio on the Start screen, and press Enter.
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name (we’ve been using saz) and password you
assigned the server when you created it.
d. Click the Connect button. The server now appears as a node in the Object Explorer.
3. Right-click the server node in the Object Explorer, and choose New Query to open a new
query window connected to the master database.
4. In the new query window, type CREATE DATABASE WineCloudDb and press F5 to execute
the script. This creates a new WineCloudDb database on the server.
5. Expand the Databases node in the Object Explorer. If the new WineCloudDb database is not
visible, right-click the Databases node and choose Refresh.
6. Right-click the WineCloudDb database in the Object Explorer, and choose New Query to open
a new query window connected to the WineCloudDb database.
7. Type the code shown in Listing 5-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
9. Close the query window. (You don’t need to save the changes unless you want to.)
SQL Database Firewall restricts access to SQL Database based on the origin IP address of the
connection. It is an opt-in model, which means that by default all connections to SQL Database are blocked; you must explicitly create firewall rules to allow connections from specific IP addresses.
Tip In addition to server-level firewall rules created using the Microsoft Azure
management portal, you can configure database-level firewall rules using the
sp_set_database_firewall_rule stored procedure found in each database. For more
information on configuring database-level firewall rules, visit http://msdn.microsoft.com/
en-us/library/jj553530.aspx.
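For example, running a statement along the following lines in a user database creates a rule that applies only to that database (the rule name and IP addresses shown are placeholders):
EXEC sp_set_database_firewall_rule N'ExampleRule', '203.0.113.5', '203.0.113.5'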
SQL Database server-based firewall rules are defined for an entire SQL Database server, they are
stored in the master database, and they allow clients to connect to any database within that server.
These rules can be edited directly in the master database, and they can also be managed using the
Microsoft Azure management portal. SQL Database firewall rules have a name, a starting IP address,
and an ending IP address. This arrangement allows you to create a single rule for multiple consecutive
IP addresses, such as an entire IP subnet. You can also create a rule for a single IP address by
making the starting IP address and ending IP address the same.
Follow these steps to add your IP address to the server-based firewall rules:
3. Click the SERVERS link at the top of the page. This displays a list of your Microsoft Azure SQL
Database servers, as shown in Figure 5-1.
4. In the NAME column, click the server that contains the WineCloudDb database. This opens a
page with links for the selected server.
5. Click the CONFIGURE link at the top of the page. This displays the SQL Database server firewall
configuration, as shown in Figure 5-2.
6. If you followed similar procedures in Chapter 1, “Getting started with Microsoft Azure SQL
Database,” and Chapter 2, “Configuration and pricing,” you should already see one rule named
7. In the Allowed IP Addresses section, your current IP address should appear to the right of
CURRENT CLIENT IP ADDRESS. Click the ADD TO THE ALLOWED IP ADDRESSES link to the
right of your IP address to add it now.
With this change, you will be able to connect to the SQL Database server from your current IP ad-
dress, wherever you happen to be. Of course, if you are connecting from the same IP address you used
in Chapters 1 and 2, the new rule you just added is the same rule as the one you just removed in the
previous step. In this case, the purpose of this exercise was solely to demonstrate how to delete a rule.
New firewall rules don’t take effect until you click the SAVE button at the bottom of the page.
Before you do that, add another rule so that you can also connect from your home office. To create
the home office rule, follow these steps:
1. In the RULE NAME text box beneath the list of existing rules, type Home Office. Note that the
rule name cannot contain either forward slash (/) or backslash (\) characters, nor can it end
with a period (.) character.
2. In the START IP ADDRESS and END IP ADDRESS boxes, type the IP address range of your home
office. (Type the same IP address in both text boxes to specify a single IP address rather than a
range.) The page should appear similar to Figure 5-3.
FIGURE 5-3 New firewall rules for the current location and the Home Office
You have now updated the firewall rules. Prior to doing so, the home office IP address or addresses
you specified would be blocked from connecting to your SQL Database server using any mechanism,
including SQL Server Management Studio (SSMS), SQL Server Data Tools (SSDT), your own custom
applications using ADO.NET or Entity Framework, and even the SQL Database management portal itself.
When you create a server, you’ll see a check box you can select to enable Microsoft Azure services
to access your SQL Database server. (You used this check box in Chapter 1; see Figure 1-5.) This check
box is selected by default, and if you left it selected when you created your SQL Database server (and
as we instructed in Chapter 1), Azure services are allowed to connect to your SQL Database server.
Once the server has been created, you can easily toggle the setting to allow or block Azure services
through the firewall by clicking the YES and NO options for WINDOWS AZURE SERVICES beneath
Allowed Services at the bottom of the firewall rules page. (See the bottom of Figure 5-3.)
When you choose to allow Microsoft Azure services, a firewall rule is added to your SQL Database
server with an IP range of 0.0.0.0 to 0.0.0.0. This is a special range that allows all Microsoft Azure
services to connect to your SQL Database server, and it does not appear with your other rules on the
firewall rules page. (You can determine whether or not the 0.0.0.0 rule is in place based on whether
YES or NO is selected for WINDOWS AZURE SERVICES at the bottom of the page.)
In addition to managing server-based firewall rules in the Microsoft Azure management portal as
you’ve just done, the SQL Database Management REST API can also be used in scenarios where you
want to manage the SQL Database firewall from an application or script. (The REST API is explained in
Chapter 8, “Designing and tuning for scalability and high performance.”)
1. Launch SSMS. An easy way to do this is to press the Windows key, type sql server
management studio on the Start screen, and press Enter.
2. In the Connect To Server dialog, enter the appropriate credentials, as shown in Figure 5-4. Be
sure to choose the server that has the WineCloudDb database on it. Also, remember that the
server name must be suffixed with .database.windows.net and you must provide the login you
specified when you created the server. (We’ve been using saz in our examples.)
FIGURE 5-4 The Connect To Server dialog in SQL Server Management Studio
4. Once connected, your SQL Database server will be listed in the Object Explorer pane.
Right-click on the server name, and choose New Query as shown in Figure 5-5. This opens a
new query window connected to the master database.
Note Replace <Password> with a strong password that you select for your new
login. The password must satisfy the requirements of the password policy. For
more information about the strong password policy, see http://msdn.microsoft.
com/en-us/library/ms161962.aspx.
6. Press F5 (or click the Execute button in the toolbar) to run the script.
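The script referred to in these steps takes this general form (a sketch; WineCloudDbLogin is the login name used in the following procedures, and <Password> follows the note above):
CREATE LOGIN WineCloudDbLogin WITH PASSWORD = '<Password>'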
You have now created a new SQL Database server login; however, this login isn’t authorized to do
anything with the SQL Database server. You now need to create a user for this login and then grant
the user either server-level or database-level permissions.
loginmanager This role has permissions to create logins in the SQL Database server, similar to the
securityadmin role in SQL Server.
dbmanager This role has permissions to create databases in a SQL Database server, similar to the dbcreator
role in SQL Server.
As when creating and managing logins, you will need to manage users and role assignments using
T-SQL scripts, because the SQL Server Management Studio user interfaces are not available when
you are connected to a SQL Database server. Working in the same query window that’s connected to the master database, follow these steps:
1. Delete the T-SQL code in the query window left over from the previous procedure.
3. Press F5 (or click the Execute button in the toolbar) to run the script. This code creates
a new user named WineCloudDbUser that is associated with the existing login named
WineCloudDbLogin that you created in the previous procedure, and then adds the new user
to the loginmanager and dbmanager roles.
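That script takes roughly this form (a sketch based on the user, login, and role names just described):
CREATE USER WineCloudDbUser FROM LOGIN WineCloudDbLogin
GO
EXEC sp_addrolemember 'loginmanager', 'WineCloudDbUser'
EXEC sp_addrolemember 'dbmanager', 'WineCloudDbUser'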
You have now created a new user in the master database named WineCloudDbUser and granted
this user permissions to manage logins and databases within the scope of the SQL Database server.
This will allow you to delegate permissions to others to manage the SQL Database server without
needing to distribute the server-level principal credentials.
When using SQL Database, you don’t want to grant all users server-level permissions; instead, you
want to give users just enough permissions to do what they need. For example, if you have an application
that only needs to read data from a database, you don’t want to give the user that the application
connects as permission to write data. In this case, you can create a new user who has read-only
database permissions. This involves creating a new login in the master database and then creating a
new user associated with that login in the WineCloudDb database. Create the new read-only user now
by following these steps:
1. Delete the T-SQL code in the query window left over from the previous procedure.
(Remember that this query window is still connected to the master database.)
Note Once again, replace <Password> with a strong password for the new login.
4. In Object Explorer, expand the Databases node to reveal the list of databases on the server
(which should include the WineCloudDb database).
5. Right-click the WineCloudDb database, and choose New Query, as shown in Figure 5-6. This
opens a new query window connected to the WineCloudDb database.
FIGURE 5-6 Opening a new query window for the WineCloudDb database
6. Type the following T-SQL statements into the code window connected to WineCloudDb:
CREATE USER WineCloudDbReadonlyUser FROM LOGIN WineCloudDbReadonlyLogin
GO
EXEC sp_addrolemember 'db_datareader', 'WineCloudDbReadonlyUser'
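The WineCloudDbReadonlyLogin referenced by this CREATE USER statement is the login you created in the master database a few steps earlier, with a statement of the same form as before (a sketch):
CREATE LOGIN WineCloudDbReadonlyLogin WITH PASSWORD = '<Password>'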
You have now created a new login in the master database of your SQL Database server. You
created a user for that login in the WineCloudDb database and granted that user permissions to read
data. Granting database-level permissions enables you to grant least-privilege permissions to users
and minimize the attack surface of your SQL Database server.
This error occurs because, by default, SSMS attempts to access the master database when it
connects, but the login provided has no permissions on the master database—it just has read-only
permissions on WineCloudDb. The resolution is to use the advanced version of the Connect To Server
dialog, which allows you to specify a particular database to access other than master; in our case,
that database is WineCloudDb. To use the advanced Connect To Server dialog in SSMS, follow
these steps:
1. If you’ve closed SSMS since the previous procedure, start it up again to display the Connect
To Server dialog. If SSMS is still open, click Connect | Database Engine in the Object Explorer
toolbar menu as shown in Figure 5-8 to display the Connect To Server dialog.
2. For Server name, type the name of your SQL Database server that contains the WineCloudDb
database. (Remember to add the suffix .database.windows.net.)
5. For Password, type the password you assigned to WineCloudDbReadonlyLogin in the previous
procedure.
7. In the Connect To Database text box, type WineCloudDb as shown in Figure 5-9.
FIGURE 5-9 The advanced Connect To Server dialog lets you specify a particular database for the
connection
8. Click the Connect button. Now, instead of displaying the error message shown in Figure 5-8,
SSMS connects successfully.
You have now connected to the WineCloudDb SQL Database using SSMS with a limited access
account that has only db_datareader permissions. As explained, this is useful when you need to give a
team member read-only access to a production database to diagnose issues or conduct some analysis
on the data.
Microsoft Azure SQL Database provides security capabilities that include the firewall and server-level
roles. You can also secure SQL objects with database-level roles and permissions in just the same way
as you can with SQL Server. This enables you to follow the principle of least privilege and grant users
only the permissions they need, no more and no less. Doing so reduces the attack surface and helps
maintain security. It is a good practice to follow this principle when developing applications that con-
nect to your SQL Database, and when providing other team members credentials for SQL Database to
use for development, troubleshooting, and analysis.
When developers are initially introduced to Microsoft Azure SQL Database, they often mistakenly
think that backups are not needed because SQL Database provides high-availability features by de-
fault. The need for backups is similar to, but different from, the need for high availability. High-availability
capabilities help ensure that your database is accessible when small-scale infrastructure downtime
occurs—for example, when there is excessive load on a server or when a server needs to be rebooted
during routine maintenance. High availability does not help when something unexpected deletes or
corrupts the database, because those changes will get replicated across all high-availability nodes.
So you need a backup-and-restore strategy even with the built-in, high-availability features of SQL
Database.
The backup-and-restore process in Microsoft Azure SQL Database is different from what you might
be used to in SQL Server, because the traditional T-SQL statements BACKUP and RESTORE are not
supported. Instead of traditional backups in SQL Server, BACPAC files are used to back up and restore
with SQL Database.
Copying a database
Transactional consistency is important to maintain when backing up a transactional system like SQL
Database or SQL Server. BACPAC files do not provide transactional consistency, because a BACPAC
is created by copying tables individually, and modifications could occur between the time that the first table
and last table are copied. So the first thing you need to do when backing up a SQL Database server is
create a copy of the database that isn’t being modified using the Database Copy feature.
The Database Copy feature creates a new database from an existing SQL Database that is
transactionally consistent when the copy finishes. It does this by replicating, at the end of the process,
any changes that were made to the source database while it was being copied. Database copies
can be created either on the same SQL Database server or on a different server within the same
region.
1. In SSMS, connect to your SQL Database server that contains the WineCloudDb database using
the login you used to create the server. (We’ve been using saz in our examples.) Once con-
nected, the SQL Database server will be listed in the Object Explorer pane on the left.
4. Press F5 (or click the Execute button in the toolbar) to run the script. This starts the process of
copying the WineCloudDb database to a new database named WineCloudDbCopy.
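The script in question boils down to a single statement of this form (a sketch using the database names from these steps):
CREATE DATABASE WineCloudDbCopy AS COPY OF WineCloudDb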
You have now started copying your WineCloudDb database to a new database named
WineCloudDbCopy. In this procedure, you copied a source database to a destination database on the
same SQL Database server, but you could also copy your database to another SQL Database server,
as long as the destination database server is within the same Microsoft Azure region as the source
database server. You can do this by executing the CREATE DATABASE statement in a query window
connected to the master database on the destination server. Then just prefix the source database
name in the CREATE DATABASE…AS COPY OF statement with the name of the source server—like so,
for example:
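CREATE DATABASE WineCloudDbCopy AS COPY OF <source-server>.WineCloudDb
Here, <source-server> is a placeholder for the name of the SQL Database server that hosts the source database.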
While the database is copying, the state_desc column of the sys.databases view will return
COPYING. If the copy process fails, the state_desc column returns SUSPECT. And if the copy completes
successfully, the state_desc column returns ONLINE.
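For example, you can check the state of the copy with a simple query like this one (using the destination database name from this procedure):
SELECT name, state_desc FROM sys.databases WHERE name = 'WineCloudDbCopy'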
The new destination database gets created early in the database copy process. If a failure occurs
at any time during the database copy, the database will be left in an incomplete state and you will
need to delete the new database using the DROP DATABASE statement. You can also cancel the
database copy operation while it is running by executing the DROP DATABASE statement on the new
destination database.
1. In SSMS, connect to your SQL Database server that contains the WineCloudDbCopy database
you created in the previous procedure, using the login you used to create the server. (We’ve
been using saz in our examples.) Once connected, the SQL Database server will be listed in the
Object Explorer pane on the left.
2. Right-click on the server name, and choose New Query as shown earlier in Figure 5-5. This
opens a new query window connected to the master database.
4. Press F5 (or click the Execute button in the toolbar) to run the script. This returns the state
of the new destination database from the sys.databases view, as shown in Figure 5-10. Of
course, if you allowed enough time for the copy operation to complete since starting it in the
previous procedure, the state_desc column will report ONLINE, not COPYING.
FIGURE 5-10 The results from the sys.databases query during a database copy operation
You can obtain additional details about the copy operation (start date, completion percentage,
error details, and more) by joining the sys.databases view with the sys.dm_database_copies view on
the database_id column as follows:
SELECT *
FROM sys.dm_database_copies AS c INNER JOIN sys.databases AS d ON c.database_id = d.database_id
WHERE d.name = 'WineCloudDbCopy'
Note The sys.dm_database_copies view will return a result only while the copy is in
progress. Once the copy has completed, this view returns no results.
The database copy operation can also be monitored in the Microsoft Azure management portal,
as shown in Figure 5-11.
Once the operation completes, you have created a transactionally consistent copy of your
database, and you are now ready to export it as a BACPAC.
Exporting a BACPAC
As you learned in Chapter 4, “Creating a Microsoft Azure Storage account,” SQL Database provides
BACPAC import and export capabilities that enable you to easily migrate databases between SQL
Database and SQL Server. These import and export capabilities also provide a simple and reliable way
to back up and restore databases in SQL Database (provided that you first create a transactionally
consistent copy).
The storage service you use in Microsoft Azure to store BACPAC files is Microsoft Azure Blob
Storage. Blob Storage is a service designed for storing binary files that are very large in size. The
service supports two types of blobs: block blobs and page blobs, either of which can be used for
storing BACPAC files. At the time of this writing, the maximum size of a block blob is 200 gigabytes
(GB) and the maximum size of a page blob is one terabyte (TB).
Blob Storage is an ideal service for storing BACPAC files. To use the SQL Database export feature,
you need a Microsoft Azure Storage account. If you haven’t already set up a Microsoft Azure Storage
account named mywinestorage, follow the steps found in Chapter 4 in the section “Creating a
Microsoft Azure Storage account.”
4. Click the EXPORT button at the bottom of the page to display the EXPORT DATABASE dialog.
6. For BLOB STORAGE ACCOUNT, choose your blob storage account from the drop-down list
(or choose Create A New Storage Account from the drop-down list to create a new storage
account on the fly). Once you choose a storage account, the CONTAINER drop-down list
appears.
Tip Best practice is to create the storage account in the same Microsoft Azure
region as the SQL Database to avoid data-transfer costs between regions. See
Chapter 2 for more information on SQL Database pricing and recommendations.
7. For CONTAINER, you can choose an existing container from the drop-down list (such as the
dbimport container you created in Chapter 4), or you can create a new one. For this exercise,
choose Create A New Container from the drop-down list.
9. For SERVER LOGIN NAME, the text box is automatically populated with your server-principal
user. (We’ve been using saz in our examples.)
10. For SERVER LOGIN PASSWORD, enter the password for your login. The EXPORT DATABASE
dialog should appear similar to Figure 5-12.
FIGURE 5-12 The Export Database Settings dialog in the Microsoft Azure management portal
You have now exported your WineCloudDbCopy database to a BACPAC file in Blob Storage. The
BACPAC file provides a portable backup of your SQL Database that you can archive or restore to
another SQL Database on Microsoft Azure, or to a local SQL Server database in your own data center.
You can download your exported BACPAC file from the Microsoft Azure management portal or a
third-party storage client by browsing to the file in your storage account and container.
Importing a BACPAC
Creating a backup is not very valuable unless you can also restore from it. SQL Database makes
restoring a BACPAC file simple using the Import Database feature.
1. If you’ve closed the Microsoft Azure Management Portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
4. Click IMPORT, as shown in Figure 5-13. This displays the IMPORT DATABASE dialog.
FIGURE 5-13 Restoring a database by using the Import option in the management portal
5. Click the folder icon to the left of the BACPAC URL text box. This opens the BROWSE CLOUD
STORAGE dialog.
7. Click the dbbackups container you used in the previous Export procedure to display its
contents on the right. You can see the BACPAC file that got created by the Export operation,
as shown in Figure 5-14.
FIGURE 5-14 The contents of the dbbackups container displayed in the BROWSE CLOUD STORAGE dialog
9. Click the Open button. This returns you to the IMPORT DATABASE dialog with the URL of the
.bacpac file populated in the BACPAC URL text box.
11. For SERVER, choose any available server from the drop-down list to host the database. (Always
keep in mind that this should be a server in the same region as the storage account to
avoid bandwidth costs.) For this exercise, you can just restore the BACPAC file to the same
server you exported the WineCloudDbCopy database from, but you can also choose another
existing server from the drop-down list. (You can also choose New SQL Database Server from
the drop-down list to create a new server on the fly.) Once you choose a server, the SERVER
LOGIN NAME and SERVER LOGIN PASSWORD text boxes appear, and the SERVER LOGIN
NAME text box is automatically populated with your administrator login name.
FIGURE 5-15 Importing a BACPAC file from Microsoft Azure Storage into a new SQL Database server.
13. Click the checkmark icon in the lower-right side of the dialog to begin the import. Once the
import has completed, a notiication that the import was successful appears at the bottom of
the page, and the new WineCloudDbRestored database appears in the list of SQL Database
instances.
You have now created a database by importing a previously exported BACPAC file. Note that you
can also restore your BACPAC to any SQL Database server, including SQL Database servers in other
subscriptions, by uploading it to a storage account in the target subscription.
As we explained, BACPAC files that you create manually do not provide transactional consistency
unless you first use the Database Copy feature to create a transactionally consistent database copy,
and then create the BACPAC from the copy. Fortunately, when you schedule an automated BACPAC
export to back up a database, the generated BACPAC is transactionally consistent. Thus,
automated BACPAC exports provide an effective and reliable backup strategy for SQL Database.
1. If you’ve closed the Microsoft Azure Management Portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
5. For EXPORT STATUS under Automated Export, click the AUTOMATIC button to display the
automated export configuration.
7. For FREQUENCY, configure how often you want the database exported by entering the
export interval in days (with the default being every 7 days) and the start date of the export in
UTC time (with the default being today at midnight).
8. For RETENTION, enter the number of days to keep the exports. (The default is 30 days.) Leave
the Always Keep At Least One Export File check box selected to guarantee that at least one
export will always be retained even after the retention period expires for all exports.
9. For SERVER LOGIN NAME, type the login you used to create the server. (We’ve been using saz
in our examples.)
You have now successfully configured an automated backup schedule for the WineCloudDbCopy
database.
SQL Database provides good security and disaster-recovery features. In this chapter, you learned
how to add security at multiple levels, including IP security policies using SQL Database Firewall and
authentication and authorization using SQL Database logins, users, and roles. You also learned about
the backup and restore capabilities provided by SQL Database, which help you simplify disaster-
recovery planning by creating transactionally consistent database copies, exporting and importing
BACPAC files, and automating transactionally consistent BACPAC export schedules. Using these
capabilities of SQL Database, you can create a secure, reliable, and highly available solution using
Microsoft Azure and SQL Database.
Cloud reporting
— Leonard Lobel
In the database world, getting information into a database is only part of the job. Another major part
is extracting meaningful information, strategically analyzing the data you collected, and making
valuable sense of it. To that end, an exhaustive suite of Business Intelligence (BI) services and
components has evolved and integrated itself as part of the on-premises Microsoft SQL Server
product. Of these, SQL Server Reporting Services (SSRS) is the primary reporting tool. This component
of the SQL Server stack provides a rich reporting dialect known as Report Definition Language (RDL), a
service to render RDL-based reports, as well as client tools and designers for authoring and deploying
RDL files.
In this chapter, you will set up a Microsoft Azure virtual machine (VM) with SSRS and build some
simple RDL reports. You will do this using two front-end tools: Report Builder and Microsoft Visual
Studio (specifically, the SSDT Business Intelligence add-in for Visual Studio) to author the reports
locally, and then deploy them to SSRS on the VM in Microsoft Azure.
SQL Reporting was, essentially, a Platform-as-a-Service (PaaS) version of SSRS that ran on
Microsoft Azure. It can help to compare Microsoft Azure SQL Reporting with on-premises SSRS
in the same way you can compare Microsoft Azure SQL Database with on-premises SQL Server;
the former is a PaaS implementation of the latter that is tailored to run on Microsoft Azure and
does not support every feature available with the on-premises product.
Unfortunately, there were several problems with SQL Reporting that prompted Microsoft
to discontinue the service and to recommend SSRS on a Microsoft Azure VM instead. Argu-
ably, the most significant issue was that SQL Reporting supported only SQL Database as a
data source. In contrast, SSRS supports numerous data sources, including multidimensional
data stored in a Microsoft SQL Server Analysis Services cube, as well as non-Microsoft data
platforms, including Oracle and DB2. This meant that the only source of the data you could
render in reports with SQL Reporting was a relational database hosted on SQL Database. Also,
SSRS allows you to embed custom code in your reports and schedule automated report execu-
tion and delivery, whereas SQL Reporting did not support either of those features. Finally, there
were performance and pricing issues with the SQL Reporting service itself. It ran slower than
SSRS, and it could not be shut down when not needed, so charges would accrue steadily even
when no reports were being requested and served.
In the end, Microsoft deemed it best to discontinue SQL Reporting and recommend instead
the SSRS-in-a-VM solution that we cover in this chapter. This means that all on-premises capa-
bilities are available in the cloud as well; for example, you can build reports that run in the cloud
that are based on a variety of data sources, not just SQL Database. You can also embed custom
code (even code that calls into your own .NET assemblies) into your reports, schedule the
execution and delivery of reports, and do everything else that SSRS allows. Because the report
server catalog resides on the VM’s local disk, performance is similar to the performance you
might expect from an on-premises SSRS instance. And when you don’t need to serve reports,
you can simply shut down the VM to stop compute charges from accruing on your Microsoft
Azure subscription.
There are quite a number of procedures in this chapter, and together, they guide you through the
process of setting up a VM with SSRS on Microsoft Azure, designing reports locally using authoring
tools, and then deploying those reports to the VM.
You will first create a simple report with Report Builder using the WineCloudDb database, and
deploy the report to the VM. Then you’ll move on to creating a report with the SSDT BI tools in Visual
Studio. The simple WineCloudDb database doesn’t have enough schema (tables and columns) and
data (rows) to effectively demonstrate more advanced reports, so you’ll also download and install
AdventureWorks2012 on SQL Database for the Visual Studio report. The AdventureWorks database
(available on Codeplex) has been serving as the standard sample database for SQL Server for many
years, and there is a special version of the database designed specifically for Microsoft Azure SQL
Database.
Note Many of these procedures are one-time-only installations. Once the VM, tools, and
databases are in place, it is actually remarkably fast to put together a report and deploy
it. But because the one-time installations can be quite lengthy, you should be prepared to
take a lot of coffee breaks while you wait (or perhaps, some wine?).
■ Report Builder
■ Visual Studio Report Server projects, with the SSDT Business Intelligence add-in
As you progress through this chapter, you will learn how to use both of these tools for creating and
deploying simple reports. Of course, there are numerous features and far more complex scenarios
that are possible with RDL that fall outside the scope of this chapter. However, you will complete the
chapter with a good foundation from which to grow your reporting skills.
In this procedure, you will get a VM up and running quickly by selecting a predefined image from
the Microsoft Azure VM gallery. This image already has SQL Server with SSRS installed, which gets you
almost all the way there. However, SSRS is not configured in the VM, nor does the VM have the neces-
sary firewall rule to allow access to SSRS over TCP port 80. Furthermore, a VM endpoint must be
configured to match the firewall rule. So in the next few procedures, you will do the following:
3. Create a firewall rule in the VM that allows access to the reporting service.
4. Create a corresponding endpoint for the VM in the Microsoft Azure management portal that
allows access to the reporting service.
1. Log in to the Microsoft Azure portal. This takes you to the ALL ITEMS page.
a. For DNS NAME, type a short but meaningful globally unique name to give the VM.
We’ll use WineCloudVM (but you’ll need to choose another name if this one is still
unavailable).
b. For IMAGE, choose the latest version of the SQL Server Standard edition running on the
latest version of Windows Server from the drop-down list. (At the time of this writing, this
is SQL Server 2012 SP1 Standard on Windows Server 2012.)
c. For USER NAME, type a name for the VM administrator account. We’ll use WineAdmin.
d. For NEW PASSWORD and CONFIRM, type and retype a strong password for the VM’s
administrator account. The password must contain at least eight characters and include a
combination of uppercase and lowercase letters, digits, and symbol characters.
e. For REGION/AFFINITY GROUP, choose the region to host the VM from the drop-down
list. To avoid bandwidth charges (as discussed in Chapter 2, “Configuration and pricing”),
select the same region in which the SQL Database server with the database you’re report-
ing on is hosted. Your screen should appear similar to Figure 6-1.
f. Click CREATE A VIRTUAL MACHINE. It can take a few minutes to create the VM before the
portal indicates that it has been started, as shown in Figure 6-2.
FIGURE 6-2 The Microsoft Azure portal showing that the VM has been started
Continuing from the previous procedure in the Microsoft Azure management portal, the row for
the VM you just created should still be selected. To configure SSRS in the VM, follow these steps:
a. Click the CONNECT button at the bottom of the page. This generates an .rdp file for the
remote desktop session and sends it to the browser.
b. When prompted to open or save the .rdp file, click Open, as shown in Figure 6-3.
FIGURE 6-3 Downloading the .rdp file to start a Remote Desktop session with the VM
c. When prompted that the publisher of the remote connection can’t be verified, click
Connect.
d. When prompted by the Windows Security dialog, type the user name (WineAdmin,
for example) and password you assigned to the VM administrator account in the previ-
ous procedure. Note that the user name might already be set, and you might only be
prompted to supply the password.
e. Click OK.
f. When prompted that the identity of the remote computer can’t be verified, click Yes. This
will start a Remote Desktop session and log you in to the VM.
a. From the VM’s Start screen, you can either scroll through the tiles to find it or just type
reporting services to run an app search, and then click on the Reporting Services
Configuration Manager tile.
b. In the Reporting Services Configuration Connection dialog, click Connect. This displays
the configuration window for SSRS running in the VM, as shown in Figure 6-4.
FIGURE 6-4 Running the Reporting Services Configuration Manager inside the VM
3. Create the virtual directory for the reporting service. This will be the web service URL that
clients use to deploy and retrieve reports, and perform other SSRS operations. It typically ends
with /reportserver.
b. Click Apply in the lower-right side of the dialog. This creates the virtual directory that
exposes the reporting service over HTTP through port 80.
4. Create the Reporting Services database. This database will be stored on the local SQL Server
instance running on the VM (unlike the SQL Databases that you will be building reports for).
This database is used internally by Reporting Services to store metadata about the reports it is
hosting.
b. Click Change Database to open the Report Server Database Configuration Wizard.
d. Click Finish.
5. Create the virtual directory for Report Manager. This is a friendly front-end website that lets
you navigate the report folder hierarchy, render reports, and manage the folder structure,
user roles, and permissions. It typically ends with /reports.
b. Click Apply on the lower-right side of the dialog. This creates the virtual directory that
exposes Report Manager over HTTP through port 80.
6. Click Exit.
You’ve now configured SSRS in the VM. However, until you open the firewall for port 80, all client
requests will be blocked by the VM.
1. Launch Windows Firewall With Advanced Security in the VM. From the VM’s Start screen, you
can either scroll through the tiles to find it or just type firewall to run an app search, and then
click on the Windows Firewall With Advanced Security tile.
b. On the Protocols And Ports page, type 80 for Specified Local Ports and click Next.
c. On the Action page, accept the Allow The Connection default setting and click Next.
d. On the Profile page, accept all the default settings and click Next.
e. On the Name page, type TCP Port 80 and click Finish. The new rule should appear at the
top of the list, as shown in Figure 6-5.
From this point forward, it is no longer necessary to work with the VM directly over Remote
Desktop. SSRS is completely configured in the VM, and it can be accessed by clients directly using the
reporting services URL and the Report Manager URL. To log out of the VM now, follow these steps:
1. From the VM’s Start screen, click the user account name WineAdmin on the upper-right side.
This logs you out of the VM, but the VM is still running of course (and it’s also billing, as we
mentioned previously). There is still one last thing you need to do to make this VM function as an
SSRS server in the cloud. You need to create an endpoint for the VM in the Microsoft Azure manage-
ment portal. Without the endpoint, TCP requests over port 80 will get blocked from ever reaching the
VM by Microsoft Azure, and thus the rule you just created in the previous procedure inside the VM
would never even have the chance to allow client requests to SSRS.
Back in the Microsoft Azure management portal, the row for the VM should still be selected. To
create the endpoint, follow these steps:
1. Click on the name of the VM. This navigates to the Quick Start page for the VM.
2. Click the ENDPOINTS link at the top of the page, as shown in Figure 6-6.
FIGURE 6-6 Clicking the ENDPOINTS link to open TCP port 80 for the VM
5. For NAME, choose HTTP from the drop-down list. This sets the rest of the dialog values for
TCP on port 80, as shown in Figure 6-7.
FIGURE 6-7 Creating an HTTP endpoint opens TCP port 80 for the VM
6. Click Finish (the checkmark icon on the lower-right side). It takes a few moments to add the
endpoint. Wait until the UPDATE IN PROGRESS message disappears before proceeding.
With SSRS up and running in the VM, you can start thinking about creating your first report. This
will be a simple customer list based on a variation of the WineCloudDb database you created in
Chapter 1, “Getting started with Microsoft Azure SQL Database.”
This script creates Wine, Customer, and Order tables and loads some sample data into them. Notice
the rows being inserted into the Order table at the bottom of the script. The customer ID is the first
number after the order date, so you can see that of six orders, three of them were placed by customer
1, two by customer 2, and one by customer 4. The next number is the wine ID, followed by the quan-
tity and price for the order. With this sample data, you will build a report that groups each customer’s
orders together and calculates the sum of their orders (based on quantity and price).
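If you don’t have the companion script in front of you, the following sketch approximates the schema and sample data just described. (The table and column names are inferred from the report query used later in this chapter, so the data types, names, and values shown here are illustrative only and may differ from the actual Listing 6-1.)

-- Rough, illustrative approximation of the WineCloudDb schema and data
CREATE TABLE Wine (
  WineId int PRIMARY KEY,
  Name nvarchar(50) NOT NULL,
  Category nvarchar(15) NOT NULL,
  Price money NOT NULL);

CREATE TABLE Customer (
  CustomerId int PRIMARY KEY,
  FirstName nvarchar(50) NOT NULL,
  LastName nvarchar(50) NOT NULL,
  FavoriteWineId int REFERENCES Wine (WineId));

CREATE TABLE [Order] (
  OrderId int IDENTITY PRIMARY KEY,
  OrderDate date NOT NULL,
  CustomerId int NOT NULL REFERENCES Customer (CustomerId),
  WineId int NOT NULL REFERENCES Wine (WineId),
  Quantity int NOT NULL,
  Price money NOT NULL);

-- Illustrative sample rows; the actual listing uses its own names and values
INSERT INTO Wine VALUES
  (1, 'Mendoza Estate', 'Malbec', 42.00),
  (2, 'Valle Blanc', 'Chardonnay', 29.00),
  (3, 'Old Vine Select', 'Zinfandel', 34.50);

INSERT INTO Customer VALUES
  (1, 'Lukas', 'Keller', 2),
  (2, 'Jeff', 'Hay', 1),
  (3, 'Keith', 'Harris', 3),
  (4, 'Simon', 'Pearson', 1);

-- Six orders: three for customer 1, two for customer 2, one for customer 4
INSERT INTO [Order] (OrderDate, CustomerId, WineId, Quantity, Price) VALUES
  ('2014-01-05', 1, 2, 2, 29.00),
  ('2014-01-19', 1, 1, 1, 42.00),
  ('2014-02-02', 1, 3, 6, 34.50),
  ('2014-02-09', 2, 1, 3, 42.00),
  ('2014-02-16', 2, 2, 1, 29.00),
  ('2014-03-01', 4, 3, 2, 34.50);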
1. From the Windows Start screen, launch SSMS. You can either scroll through the app tiles to
find it (in the Microsoft SQL Server 2012 category) or just type sql server management
studio to run a search, and then click on the tile. After a brief moment, the Connect To Server
dialog appears.
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name and password you assigned the server when
you created it.
a. In the Object Explorer, right-click the server name and choose New Query to open a new
query window connected to the master database.
c. Press F5 (or click the Execute button in the toolbar) to create the database.
6. In the Object Explorer, right-click the Databases node and choose Refresh. The WineCloudDb
database you just created should now appear.
7. Right-click the WineCloudDb database, and choose New Query to open a new query window
connected to the WineCloudDb database.
8. Type the code shown in Listing 6-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
9. Press F5 (or click the Execute button in the toolbar) to create the tables and load the sample data.
You now have a new WineCloudDb database that will serve as the data source for your first report.
Now that the VM and database are all set, you’re ready to start focusing on the report. You’ll use
Report Builder to create your irst report, and later in the chapter, you’ll use Visual Studio to create a
more advanced report.
Once your data source, dataset, and layout are set, Report Builder lets you preview the report so
that you can iteratively design and then view your report without deploying it. When you’re ready,
Report Builder can deploy your report to SSRS, which is essentially just a matter of copying the RDL
file from your development environment to the VM where SSRS is running.
Alternatively, you can use Visual Studio to perform the same tasks using similar tools. You will do
just that to create another report later in the chapter. Unfortunately, there is (currently) no web-based
report designer available on the Microsoft Azure portal to run in the browser, so you need to down-
load either Report Builder or the SSDT BI plug-in for Visual Studio to design reports for SSRS.
3. When prompted to run or save the file, choose Run, as shown in Figure 6-8.
4. When the installation wizard’s welcome page appears, click Next, as shown in Figure 6-9.
7. On the Default Target Server page, leave the default target server URL text box empty and
click Next.
9. When the User Account Control dialog appears, click Yes to begin the installation.
10. When setup completes, click Finish to close the installation wizard.
1. From the Windows 8 Start screen, launch Report Builder. You can either scroll through the
tiles to find it or just type report builder to run an app search, and then click on the Report
Builder 3.0 tile.
2. On the Getting Started dialog, click Blank Report, as shown in Figure 6-10.
FIGURE 6-10 Choosing to create a blank report from the Getting Started dialog
The Report Builder window should now appear as shown in Figure 6-11.
Notice the Office-style ribbon user interface—with Home, Insert, and View tabs—and the large
round Office button in the upper-left corner, which displays a menu with options for saving and
deploying reports. The main design surface already has a text box for you to type the report’s title,
and it also uses the built-in field &ExecutionTime in a text box (at the lower-right side of the report) to
display the date and time at which the report was executed on the bottom of each page.
On the left side of the window, the Report Data pane gives you access to the key elements of
your report. The most important of these are your data sources, which define the connection to the
SQL Database that the report is querying. Next, your dataset defines exactly what data gets fed to
the report. This can be a table or view, the result of calling a stored procedure, or any Transact-SQL
(T-SQL) query that gets sent to SQL Database for processing. Data sources and datasets are typically
embedded in individual reports, but you can also define shared data sources and shared datasets and
deploy them as reusable objects on the reporting service. This makes it easy to share the same query
and provide consistent data across multiple reports.
Once you have at least one data source, and at least one dataset that consumes data from that
data source, you are ready to design the layout (and behavior) of the report. To present numerical
data, you can use either the table or matrix control. Both of these controls render data in tabular
form; the difference lies in the way columns are handled. With both tables and matrixes, there are
a variable number of rows in the output. However, tables have fixed columns, whereas matrixes have
a variable number of columns, just as both have a variable number of rows.
The Report Data pane also lets you define report parameters, which are values that are typically
used by the dataset query to filter the report (for example, by date range or by product category).
There is also a place to add images to the report, and the pane provides easy access to built-in fields
(a set of handy values including things like current date and time, page number, total pages, user ID,
and so on) to include in the report.
That’s more than enough information for you to get started. So go ahead and create your data
source.
1. Right-click Data Sources in the Report Data pane on the left side of the Report Builder
window, and choose Add Data Source.
4. Choose Microsoft SQL Azure from the Select Connection Type drop-down list.
5. Click the Build button on the right side of the Connection String text box. This opens the
familiar Connection Properties dialog.
6. Supply the connection information to the WineCloudDb database you created in the previous
section.
b. Choose the Use SQL Server Authentication option, and type the user name and password
you assigned the server when you created it.
c. In the drop-down list beneath the Select Or Enter A Database Name option, select the
WineCloudDb database. Or, if the drop-down list appears empty, type WineCloudDb
directly into it.
d. Click OK to close the Connection Properties dialog. The Data Source Properties dialog
should now appear similar to Figure 6-12.
In the Report Data pane, WineCloudDataSource now appears beneath the Data Sources node. Your
next task is to create a dataset.
SELECT
CONCAT(c.LastName, ', ', c.FirstName) AS [Customer Name],
CONCAT(w.Name, ' (', w.Category, ')') AS [Favorite Wine],
COUNT(*) AS Orders,
SUM(o.Quantity * o.Price) AS Total
FROM
Customer AS c
LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
LEFT OUTER JOIN [Order] AS o ON c.CustomerId = o.CustomerId
GROUP BY
c.FirstName, c.LastName, w.Category, w.Name
ORDER BY
c.LastName, c.FirstName
Back in the FROM clause, the Customer table is also further joined on the Order table (aliased as o),
which returns each customer’s orders. This normally results in returning one row per order, which in
turn duplicates customer information in each order row belonging to that customer. However, using
a GROUP BY clause and the COUNT and SUM functions for the Total column, this query aggregates
(summarizes) the related order rows into a single row with the number of orders and the order total,
for each customer. Thus, the query still returns exactly one row per customer. Note that the SUM
function is dynamically calculating each order’s total by multiplying the order quantity and price.
This is because our Order table doesn’t have a total column with this information. (A real database
probably would, but our simple WineCloudDb database doesn’t.) Thus, for each individual order row
belonging to a customer, that order’s total is calculated as quantity multiplied by price, and then that
total is aggregated (summed) across all the order rows for the customer.
In the next procedure, you will create a dataset for the report based on this query. Although you
will embed the query from Listing 6-2 directly into the report, you could just as easily create a stored
procedure and put the query there. Using a stored procedure offers an alternative to shared datasets
for reusing queries across multiple reports, because then even different reports that use their own
(nonshared) datasets can still be fed the same data by calling the same stored procedure.
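As a rough sketch of that alternative (the procedure name below is our own and does not appear in the book’s sample scripts), you could wrap the Listing 6-2 query in a stored procedure like this:

CREATE PROCEDURE GetWineCustomerSummary
AS
BEGIN
  SET NOCOUNT ON;
  -- Same query as Listing 6-2, now reusable by any report's dataset
  SELECT
    CONCAT(c.LastName, ', ', c.FirstName) AS [Customer Name],
    CONCAT(w.Name, ' (', w.Category, ')') AS [Favorite Wine],
    COUNT(*) AS Orders,
    SUM(o.Quantity * o.Price) AS Total
  FROM
    Customer AS c
    LEFT OUTER JOIN Wine AS w ON c.FavoriteWineId = w.WineId
    LEFT OUTER JOIN [Order] AS o ON c.CustomerId = o.CustomerId
  GROUP BY
    c.FirstName, c.LastName, w.Category, w.Name
  ORDER BY
    c.LastName, c.FirstName;
END

Each report’s dataset would then choose the Stored Procedure query type in the Dataset Properties dialog and reference GetWineCustomerSummary by name, instead of embedding the query text.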
1. Right-click Datasets in the Report Data pane on the left side of the Report Builder window,
and choose Add Dataset.
5. Type the code in Listing 6-2 into the Query text box. The Dataset Properties dialog should
now appear similar to Figure 6-13.
The Report Builder window should now appear similar to Figure 6-14.
FIGURE 6-14 The Report Builder window with a data source and dataset configured
1. Click inside the text box in the upper-left portion of the report’s design surface (where it says
Click To Add Title), and type Wine Customers.
2. Click Insert at the top of the window to display the Insert ribbon.
3. Click Table in the Insert ribbon, and choose Table Wizard, as shown in Figure 6-15. This
displays the New Table Or Matrix wizard.
5. In the Available Fields list box, click the first field, Customer_Name.
6. Hold down the SHIFT key and click on the last field, Total. This selects all the available fields.
7. Drag the selected fields from the Available Fields list box, and drop them in the Values list box,
as shown in Figure 6-16.
9. Click Next to advance to the Choose A Style page. The Ocean style is selected by default.
10. Click Finish to complete the wizard, and add the table to the report design surface.
11. Click in any of the table’s cells to display the gray selection bars on the left and top sides of
the table.
12. Click and drag on the lines between columns in the gray selection bar on the top side of the
table to widen the Customer Name column.
13. Repeat the previous step to widen the Favorite Wine column.
14. Click the [Sum(Total)] cell beneath the Total column header.
15. Click the dollar-sign button in the Number panel of the Home ribbon at the top of the win-
dow. This will cause the order total value to be formatted as currency in the report (rather
than an ordinary decimal number).
16. Click any unused (white) area of the report design surface to hide the gray selection bars
from the table. The completed report design should appear as shown in Figure 6-17.
1. Click the Save icon in the Quick-Access Toolbar at the top of the Report Builder window
(above the ribbon), as shown in Figure 6-18. Alternatively, press CTRL+S. This displays the Save
As Report dialog which, by default, is set to save the report locally in your Documents folder.
3. Click Save.
4. Click the Run button on the Home ribbon to execute the report, as shown in Figure 6-20.
5. The report should appear similar to Figure 6-21. After viewing the report, click the Design
button on the Run ribbon to return to the report designer.
The preview shows that the query correctly returned the count and sum of each customer’s orders.
The query also correctly formatted the customer and wine names. Satisfied with the report, you are
now ready to deploy it so that it’s available to users in the cloud.
1. In Report Builder, click the round Office button at the top left of the window and click Save As,
as shown in Figure 6-22.
3. In the Connect To Report Server dialog (which can take a few moments to appear), type
WineAdmin and the password you assigned to the VM’s administrator account.
4. Select the Remember My Password check box to prevent Report Builder from prompting you
for the credentials when performing future deployments, and then click OK.
FIGURE 6-23 Saving the report to the Microsoft Azure VM running SSRS
The report is now running in the VM on Microsoft Azure, and it can be accessed from any browser.
Currently, administrator account WineAdmin is the only user, but you can create other users with
varying permission levels, as we discuss at the end of the chapter.
If you navigate your browser to the reporting service URL, you will experience a rather bare-bones
interface with simple blue hyperlinks on plain white pages. These hyperlinks let you navigate the
folders and subfolders in the hierarchy of reports, and choose any report to render in full fidelity. The
Report Manager URL offers a friendlier interface to navigate the report folder hierarchy and render
reports. It also provides many other features you can use to manage the hierarchy structure, users,
groups, permissions, and roles.
To run the deployed report from your browser using Report Manager, follow these steps:
3. If prompted for credentials by the Windows Security dialog, type WineAdmin and the
password you assigned to the VM’s administrator account. The Report Manager home page is
displayed with links for all the available folders and reports on the server. At this point, there is
only one link for the CustomerList report in the root folder, as shown in Figure 6-24.
FIGURE 6-24 The Report Manager home page with links to available reports on the VM
4. Click the CustomerList report link to run the report, as shown in Figure 6-25.
Before closing the browser, take the time to discover the features exposed by the toolbar at the
top of the page. You can page through the report (although this simple report has only one page),
increase the magnification, search for text within the report, export the report to various formats, and
print the report. These standard capabilities are available for every report rendered by SSRS.
Note Some versions of Internet Explorer require the compatibility view to be set for
Microsoft Azure VMs that serve reports using SSRS. Otherwise, the toolbar does not render
correctly at the top of the page, and it appears as multiple toolbars with individual items
rather than a single toolbar with multiple items, as shown in Figure 6-25. If you experience
this behavior, drop down the Tools menu in Internet Explorer and choose Compatibility
View Settings. Click Add to set the compatibility view for all cloudapp.net websites. Then
click Close to close the Compatibility View Settings dialog. Internet Explorer will refresh the
page automatically, and the toolbar will render correctly.
Unfortunately, the BI project support (whether you call it BIDS or SSDT) has not been very
well aligned with the Visual Studio product release cycles. For years after the release of Visual
Studio 2010, BIDS ran only under the Visual Studio 2008 shell, requiring developers building
.NET applications and reports to toggle between the two Visual Studio versions. Then, in 2012,
SSDT replaced BIDS (while introducing new relational database tooling) and finally brought
unity to all project types under the single Visual Studio 2010 shell. However, that pleasure
was short-lived once Visual Studio 2012 was released, where SSDT lost the BI project support
for Reporting Services, Analysis Services, and Integration Services and retained only the new
relational database tools and features. And at the time of this writing, Visual Studio 2013 still
does not have SSDT BI support. Thus, you need to download and install the SSDT BI support to
create Report Server projects in the Visual Studio 2012 shell (a procedure you will see coming
up shortly), even if you are running Visual Studio 2013 otherwise.
Essentially, however, it has always been (and continues to be) the Visual Studio shell that
provides project templates, designers, and deployment tools for Reporting Services. So despite
the sometimes awkward brand names and untimely release cycles, we’ll often refer to it simply
as Visual Studio.
In this section, you will learn how to use Visual Studio to design and deploy Report Server projects
to SSRS. But first, you’ll do two things in preparation:
• This sample database is much larger than WineCloudDb, and it will serve as a better
source of reporting data for your next report. Furthermore, it will help you explore many
additional reporting capabilities beyond what we cover in this chapter.
■ Install the SSDT Business Intelligence project templates for Visual Studio.
• If you’re running Visual Studio 2012, you’ll need these project templates to create Report
Server projects.
• If you’re running Visual Studio 2013, you’ll still need these project templates, which will run
in the Visual Studio 2012 shell.
• If you’re running Visual Studio 2010, you already have these project templates.
2. Navigate to http://msftdbprodsamples.codeplex.com/releases/view/37304.
Note This URL might have changed by the time this book goes to press. In this
case, run an Internet search for “download adventureworks2012 for windows
azure sql database” to find the updated link.
4. When prompted to open or save, click the drop-down portion of the Save button and choose
Save As, as shown in Figure 6-26.
5. In the Save As dialog, navigate to any folder of your choice (or accept the default Downloads
folder) and click Save.
6. When the download completes, click the Open Folder button. This opens a new Windows
Explorer window to the folder where the downloaded AdventureWorks2012ForSQLAzure.zip
file was saved.
8. In the Extract Compressed (Zipped) Folders dialog, click Extract to unzip the file and open
a new Windows Explorer window to the extracted files.
FIGURE 6-28 Deploying the AdventureWorks2012 database to a Microsoft Azure SQL Database server
Note This step requires the .NET Framework 3.5 (even if .NET Framework 4.5 is installed as
part of Visual Studio 2012 or 2013). If it's not already installed, this command will gener-
ate errors and a Windows Features dialog will appear prompting you to download .NET
Framework 3.5. Click on Download And Install This Feature to install the .NET Framework 3.5.
A reboot is required, after which you should open a new Windows Explorer window to the
AdventureWorks folder and restart this procedure from step 10.
In addition, this step relies on the bcp command-line utility that ships with SQL Server. If you
don’t already have SQL Server installed on your local machine, you don’t have bcp.
However, you don’t need to install SQL Server locally just to obtain the bcp utility needed
to create the AdventureWorks2012 SQL Database on Microsoft Azure. The bcp utility can be
installed without SQL Server by downloading the Microsoft ODBC Driver 11 for SQL Server
from http://www.microsoft.com/en-us/download/details.aspx?id=36434, and then downloading
the Microsoft Command Line Utilities 11 for SQL Server from
http://www.microsoft.com/en-us/download/details.aspx?id=36433.
Tip The script creates the database very quickly, but then takes a long time to
populate the tables with data. It’s not necessary to wait for the script to com-
plete before proceeding to install SSDT Business Intelligence for Visual Studio
2012, which is the next step. Because that installation is also somewhat lengthy,
you can save a lot of time by not waiting for the AdventureWorks2012 data-
base to fully populate before starting the next install, and then waiting for both
processes to complete before creating your first Report Server project.
FIGURE 6-29 Viewing the deployed AdventureWorks2012 database in the Microsoft Azure management
portal
Next, you’ll install the SSDT BI tools for Visual Studio so that you can start creating Report Server
projects.
Note During installation, you might be prompted to restart your computer. In this case, click
OK, and the installation procedure will resume automatically after your computer restarts.
FIGURE 6-30 The Download Center page for Microsoft SQL Server Data Tools - Business Intelligence For
Visual Studio 2012
Note This URL might have changed by the time this book goes to press. In this
case, run an Internet search for “download business intelligence for visual studio”
to find the updated link.
3. Click the Download link. You will be given the choice to Run or Save.
5. When the User Account Control dialog appears, click Yes to start the SQL Server 2012 Setup
Wizard.
6. On the License Terms page, select the I Accept The License Terms check box and then
click Next.
7. On the Product Updates page, click Next to begin the installation process.
a. Be sure to choose the option for performing a new installation (which is selected by
default), even though SQL Server is already installed. Otherwise, the setup will fail with an
“architecture mismatch” error.
b. Click Next.
9. On the Feature Selection page, click Select All and then click Next.
10. On the Error Reporting page, click Next. When the installation completes, the Complete page
is displayed indicating success, as shown in Figure 6-31. If prompted to restart the computer,
click OK.
FIGURE 6-31 A successful installation of the Microsoft SQL Server Data Tools - Business Intelligence For
Visual Studio 2012
LISTING 6-3 The query for this report returns detailed sales information with related territory information
SELECT
soh.SalesOrderID,
DATEPART(YEAR, soh.OrderDate) AS [Year],
soh.CustomerID,
soh.TerritoryID,
terr.Name as TerritoryName,
terr.CountryRegionCode as Country,
soh.TotalDue as TotalSales
FROM
Sales.SalesOrderHeader AS soh
INNER JOIN Sales.SalesTerritory AS terr ON terr.TerritoryID = soh.TerritoryID
ORDER BY
[Year]
This example differs quite significantly from our previous report of wine customers. Recall from
Listing 6-2, that report’s query used a GROUP BY clause with COUNT and SUM aggregate functions to
summarize order information for each customer, so that the SQL Database query engine performed
the aggregation. SSRS merely dumped that information into a table; the number of rows returned by
the query is always the same as the number of rows in the report.
In this query, you’ll notice that there is no aggregation, meaning that the query engine is returning
order-level information that merely includes territory information (duplicated across orders in the
same territory), and it’s the reporting engine that will aggregate those totals to the territory level.
That is, the query itself returns 31,465 rows of order data, and the report you will create summarizes
that set down to 10 rows, one per territory. This means you can summarize at the database level
wherever it makes sense (or is convenient), and then summarize further if needed at the report level.
Also, by joining on the territory table and bringing in the country-territory hierarchy, you can deliver
automatic drill-down capabilities to your users (in this case, expanding and collapsing the territories
within each country). These are some great examples of the flexibility you get with RDL and SSRS.
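For comparison, here is one way (our own illustration, not one of the chapter’s listings) the same territory-level summary could be pushed down to the SQL Database query engine instead, so that only the aggregated rows are returned to SSRS:

SELECT
  terr.CountryRegionCode AS Country,
  terr.Name AS TerritoryName,
  DATEPART(YEAR, soh.OrderDate) AS [Year],
  SUM(soh.TotalDue) AS TotalSales
FROM
  Sales.SalesOrderHeader AS soh
  INNER JOIN Sales.SalesTerritory AS terr ON terr.TerritoryID = soh.TerritoryID
GROUP BY
  terr.CountryRegionCode, terr.Name, DATEPART(YEAR, soh.OrderDate)
ORDER BY
  Country, TerritoryName, [Year]

This version sends only the summarized rows across the wire, but the report then has no access to the underlying order rows; keeping the detail rows, as Listing 6-3 does, leaves the aggregation (and any future regrouping) to the report itself.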
In another difference from the previous example, this report will use a matrix rather than a table.
Notice that the query uses the DATEPART function with YEAR to extract the year of each order into
its own column. You will use this column to create a matrix so that in addition to rendering a variable
number of rows (one per territory), the report will render a variable number of columns (one per year)
as well.
1. Launch SQL Server Data Tools For Visual Studio 2012. From the Windows 8 Start screen, you can
either scroll through the tiles to find it or just type sql server data tools to run an app search,
and then click on the SQL Server Data Tools For Visual Studio 2012 tile.
2. If this is the first time you have started SQL Server Data Tools for Visual Studio 2012, you
will be prompted for default environment settings. In this case, choose Business Intelligence
Settings and click the Start Visual Studio button.
3. Click the FILE menu, and then choose New | Project to display the New Project dialog.
4. On the left side of the New Project dialog, expand Templates, Business Intelligence, Reporting
Services, and choose Report Server Project Wizard.
5. Name the solution and project AWReporting, and choose any desired location, as shown in
Figure 6-32.
FIGURE 6-32 Creating a new Report Server project using the Report Server Project Wizard
e. Choose the Use SQL Server Authentication option, and type the user name and password
you assigned the server when you created it.
g. In the drop-down list beneath the Select Or Enter A Database Name option, select the
AdventureWorks2012 database.
h. Click OK to close the Connection Properties dialog. Your screen should appear similar to
Figure 6-33.
FIGURE 6-33 Defining the report’s data source as AdventureWorks2012 on Microsoft Azure SQL Database
i. Click Next.
9. On the Design The Query page, type the code shown in Listing 6-3 into the Query String text
box (or paste it in from the listing file downloaded from the book’s companion website). Your
screen should appear similar to Figure 6-34.
11. On the Select The Report Type page, choose Matrix and click Next.
a. Drag TotalSales from Available Fields, and drop it in the Details list.
b. Drag Year from Available Fields, and drop it in the Columns list.
c. Drag Country from Available Fields, and drop it in the Rows list.
d. Drag TerritoryName from Available Fields, and drop it in the Rows list beneath Country.
e. Select the Enable Drilldown check box. The Report Wizard should appear as shown in
Figure 6-35.
f. Click Next.
14. In the Report Server text box on the Choose The Deployment Location page, type
http://<vmname>.cloudapp.net/reportserver (replacing <vmname> with the name
assigned to your VM), and click Next.
15. In the Report Name text box on the Completing The Wizard page, type Annual Sales By
Territory.
The wizard generates the report as shown in Figure 6-36. You can see how Visual Studio provides
many of the same features as Report Builder. You get the same report designer surface, flanked on
the left by the Report Data pane (expanded in Figure 6-36 to show the data source and dataset) and
on the bottom by the row and column groups defined for the matrix. (Notice the country-territory
hierarchy in the row groups.) The designer also has a Preview tab to run the report locally, without
deploying it.
The wizard is great for quickly bringing together all the elements of a report, but you’ll almost
always need to customize or adjust the report that it produces. In our wine customer report, we didn’t
bother formatting the numbers as currency, but business users usually like to see numbers in the
language that counts—money!
2. Click Number in the left navigation pane of the Text Box Properties dialog.
5. Select the Use 1000 Separator check box. The Text Box Properties dialog should appear as
shown in Figure 6-37.
6. Click OK.
You’re all set to preview the report, so click the Preview tab at the top of the window. At first, the
report displays only six rows, because the country-territory hierarchy is collapsed, and the report
is showing aggregated information at the country level. As it turns out, the AdventureWorks data-
base has only one territory per country except for the U.S., which has territories defined for Central,
Northeast, Northwest, Southeast, and Southwest. Click the plus sign next to US to expand those
territories and view the total sales for the various regions within the U.S., as shown in Figure 6-38.
Let’s add a bar chart to this report that reflects the same information as the table. The chart will
show total sales for each country (it won’t drill down into the territory level), across all years. You will
first enlarge the working space for the report, and then place the chart just beneath the table. To
configure the chart data, you will set its values to TotalSales, its category groups to Country, and its
series groups to Year.
1. Click the Design tab at the top of the window to leave the report preview and return to design
mode.
2. Click the bottom border of the report, and drag down to lengthen the height of the report’s
design surface. Give it a generous amount of vertical space to accommodate the chart.
3. Click VIEW | Toolbox to display the toolbox (if it’s not already currently visible).
4. Click and drag the Chart item from the toolbox, and drop it on the report, just below the
table, and all the way to the left.
5. In the Select Chart Type dialog, the Column chart is selected by default, as shown in
Figure 6-39. Click OK to choose it.
8. For Values, click the green plus sign and choose TotalSales.
9. For Category Groups, click the green plus sign and choose Country.
10. For Series Groups, click the green plus sign and choose Year. The designer should appear
similar to Figure 6-40.
FIGURE 6-40 Setting the values, category groups, and series groups that define the chart’s data.
a. Click once on the Chart Title to select the text box, and then once again to enter edit
mode. Replace Chart Title with By Country, and press Enter.
b. Click once on the vertical Axis Title to select the text box, and then once again to enter
edit mode. (The text box will temporarily shift to horizontal display so that you can type.)
Replace Axis Title with Total Sales, and press Enter.
c. Click once on the horizontal Axis Title to select the text box, and then once again to enter
edit mode. Replace Axis Title with Country, and press Enter.
As shown in Figure 6-42, the Property Pages dialog shows the TargetServerURL set to the URL
you supplied to the wizard, along with several other interesting properties. For example, notice the
TargetReportFolder, which is set to AWReporting. This means that, when you deploy, the report itself
(which is the file Annual Sales By Territory.rdl) will be created beneath a folder called AWReporting
(named, by default, after the project, but easily changed here if desired). Now click Cancel to close the
project Property Pages dialog.
To deploy the report to SSRS on the VM and then render it, follow these steps:
5. When prompted, enter the administrator user name WineAdmin and its password that you
assigned when you created the VM, as shown in Figure 6-43.
6. Click OK to deploy. When completed, Visual Studio shows the results in the Output window, as
shown in Figure 6-44.
3. If prompted for credentials by the Windows Security dialog, type WineAdmin and the
password you assigned to the VM’s administrator account. The Report Manager home page
is displayed with links for all the available folders and reports on the server. This includes the
AWReporting folder that Visual Studio deployed the report to, as shown in Figure 6-45.
FIGURE 6-45 The Report Manager home page with the AWReporting folder link
As you can see in Figure 6-46, the report looks and works the same in the browser as it did in the
designer preview inside Visual Studio.
By now, you understand the basic steps involved in working with SSRS. Whether you choose to use
Report Builder or Visual Studio Report Server projects, you define your data sources and datasets and
lay them out in one or more reports (RDL files). You can then preview the report locally and deploy to
the cloud whenever desired. After you deploy a report, you need to start thinking about security.
Windows authentication is convenient and easy, but it is not the best choice in all situations. SSRS
also supports several other types of authentication, including basic authentication, forms-based
authentication, and custom authentication. Basic authentication encodes the user name and pass-
word in clear text as a base-64 encoded string in the HTTP header, so it is only secure if you also
encrypt the channel to make the HTTP header unreadable, typically using Secure Sockets Layer (SSL).
Forms-based authentication is a security extension you use to manage your own user store, which
can be something like a database table or configuration file. If you have very particular requirements
that cannot be met by any of these supported authentication types, you can also implement custom
authentication. This is an advanced scenario that requires custom code as well as a good deal of
expertise in ASP.NET security.
Authorization is separate and distinct from authentication. Once a user is authenticated, what that
user can and cannot do is controlled by role assignments you define using Report Manager. The least
privileged role is Browser, which just allows users to view folders and reports. The most privileged role
is Content Manager, which allows users to publish reports and gives them total permission (including
delete) to folders, reports, and report definitions. You assign users to roles for specific reports or
report folders, which determines whether a user can access that particular resource, or if the user can
perform a specific operation (for example, delete a report or deploy a report). You can also create
groups of users, and then assign groups to a role. This effectively adds every user in the group to the
role, which makes it easy to manage multiple users as a single entity.
More Info Security in SSRS is sophisticated and potentially complex. The MSDN online
documentation provides a thorough and detailed treatment of SSRS security that you
should familiarize yourself with as you contemplate the security model for your particular
requirements. The documentation can be found at http://msdn.microsoft.com/en-us/
library/bb522728.aspx.
1. Log in to the Microsoft Azure portal. This takes you to the ALL ITEMS page.
3. Click on the virtual machine to select it. (Click anywhere on the row except in the name
column, or the portal will navigate you away to the VM’s Quick Start page.)
The VM uses a virtual hard disk that is stored in Microsoft Azure Blob Storage, so it retains its state
while it remains shut down. Whenever you need to start delivering reports again to your users, simply
boot the VM from the Microsoft Azure management portal to bring SSRS back online.
Summary
This chapter introduced you to SQL Server Reporting Services (SSRS) and showed you how to create a
Microsoft Azure virtual machine (VM) with SSRS to implement cloud reporting. You started by creat-
ing the VM and configuring it for SSRS with reporting service and Report Manager URLs. You then
used the standalone Report Builder tool to create a simple customer report for the WineCloudDb
database. In the process, you created a data source and dataset for the report, designed the layout
using a table, and deployed the report to SSRS on the Microsoft Azure VM.
You then downloaded the AdventureWorks sample database to use as the data source for an
annual sales report built in Visual Studio with a Report Server project. After downloading the SSDT
Business Intelligence add-in for Visual Studio 2012, you used the Report Server Project Wizard to cre-
ate the data source and dataset, and design the table layout. After applying some custom formatting
and adding a bar chart to the report, you then deployed the report to SSRS on the Microsoft Azure
VM directly from inside Visual Studio. We ended the chapter with a high-level discussion of report
security, user authentication, and authorization.
There is certainly much more for you to discover on your own with SSRS and RDL. Now that you
understand the most important concepts and features, you have the foundation you need to further
explore these technologies and advance your cloud reporting skills.
The Microsoft Azure platform provides a special service called Microsoft Azure SQL Data Sync. You
can use this service to automatically discover data changes made in one database and replicate
those changes to another database (or to any number of other databases). In this chapter, we’ll begin
with an overview of Microsoft Azure SQL Data Sync, and then dive into a series of procedures for you
to follow that put this cloud service to use in a number of common scenarios.
Note In this chapter, we refer to Microsoft Azure SQL Data Sync simply as SQL Data
Sync. Furthermore, as mentioned in Chapter 1, “Getting started with Microsoft Azure SQL
Database,” the term SQL Database refers specifically to Microsoft Azure SQL Database in
the cloud, whereas the term SQL Server refers specifically to local (on-premises) SQL Server.
In the sections that follow, we discuss these scenarios and explain how SQL Data Sync can be used
to implement a solution for each one of them.
It is incredibly easy to configure and use SQL Data Sync. Everything happens through the Microsoft
Azure portal. No local tools are needed. You can use the portal to specify the databases you want
synchronized and the datasets within those databases (which tables and columns) to be synchronized.
You can also schedule an interval of time that controls how often SQL Data Sync will synchronize the
databases automatically, thus controlling how up to date those databases will be. The only time you
need to install something locally is when configuring an on-premises SQL Server database for syn-
chronization. This requires the installation of a small agent component, a lightweight Windows Service
that registers local databases with SQL Data Sync.
The collection of databases to be synchronized (called reference databases) is defined within a
sync group. The reference databases in a sync group can include any number of local (on-premises)
SQL Server databases, any number of databases in the cloud (SQL Database), or any combination of
on-premises and cloud databases.
Within the sync group, one reference database is designated as the hub and all the other reference
databases act as spokes, in what can be viewed as a hub-and-spoke model. In our discussion, we will
refer to the spokes as clients, because that is the terminology used by the SQL Data Sync service and
documentation. In between the hub and each client, you can monitor and apply changes in one
direction (from hub to client), the other direction (from client to hub), or bi-directionally (two-way).
When two-way synchronization is enabled between a hub and multiple clients, data changes made
to the hub are pushed out to all the clients. A data change in any individual client is first pushed to
the hub, and then pushed out again to all the other clients. If the same data is changed in two places
within a synchronization pass, you can control who “wins” by setting the conflict-resolution behavior
for the sync group.
The only requirement in this model is that the hub database absolutely must be a SQL Database
(that is, it must be a database in the cloud). An on-premises SQL Server database cannot function as
the hub of a sync group. All the other reference databases in the sync group (the clients) can be either
another SQL Database in the cloud or an on-premises SQL Server database.
Important SQL Data Sync is provided as a “preview” release available to all Microsoft
Azure subscribers, so support from Microsoft is not guaranteed. We normally recommend
against using preview releases like SQL Data Sync for production applications (as Microsoft
does), particularly because it is reasonable to expect that newer and more sophisticated
replication services will emerge on Microsoft Azure in the longer term.
However, SQL Data Sync is currently the only synchronization service for SQL Database
and SQL Server available from Microsoft on Microsoft Azure. It is also extremely easy to
use and reliable, and it has been freely available as a preview release since 2007 (when the
service was formerly called SQL Azure Data Sync). In fact, there are reports of customers
enjoying great success with SQL Data Sync in production environments, and even (under
special circumstances) receiving limited Microsoft support. But to reiterate, with preview
release software, there are no guarantees. You must carefully consider all these facets
before adopting SQL Data Sync and integrating it into your production solution.
In this scenario, Data Sync can be used to synchronize one-way, from the SQL Server client up to
the SQL Database hub. (See Figure 7-1.) Changes made on-premises in SQL Server will be replicated
automatically to SQL Database in the cloud. However, the cloud applications are not able to affect
the on-premises SQL Server database in any way, because with one-way synchronization set up in this
direction, SQL Data Sync will not monitor the SQL Database hub for changes that might be made in
the cloud; thus, it will never modify the SQL Server client database.
FIGURE 7-1 One-way publishing of data mastered on-premises with SQL Server to SQL Database in the cloud
FIGURE 7-2 One-way publishing of data mastered in the cloud with SQL Database to on-premises SQL Server
FIGURE 7-3 Two-way publishing of shared data between multiple locations via a centralized hub in the cloud
When you have multiple locations, each location runs its own set of applications and uses data in
its own SQL Server database. In some cases, the requirement is to share data between the different
SQL Server (on-premises) locations. That is, each location has some data that needs to be kept in sync
so that it’s available in all the other locations (for example, a product catalog). In this case, SQL Data
Sync uses the hub as a conduit through which data is synchronized bi-directionally. Changes are irst
replicated from the on-premises location to the hub, and then pushed back out to all the other on-
premises locations. In this manner, SQL Data Sync can update each location with changes made in any
other location, via the SQL Database hub in the cloud.
Another common scenario is to use SQL Database in the cloud as a location to aggregate
(combine) the data from these multiple locations. Basically, you can use SQL Data Sync to pull the
location-specific data from each location and aggregate it into a centralized SQL Database. Then you
can create a cloud-based application that consumes the view of that aggregated data across those
multiple locations. This is just one example of how SQL Data Sync can provide great insight by pulling
distributed data together into a single SQL Database. The cloud-based application can even update
data in the SQL Database (either shared data or location-specific data), causing SQL Data Sync to push
those changes back down appropriately; the service will send shared data changes to all locations,
and location-specific data changes to just the individual corresponding locations.
FIGURE 7-4 Synchronizing multiple cloud databases across multiple Microsoft Azure data centers
By maintaining multiple copies of the same data in the cloud (both within a single data center as
well as across multiple data centers), you can scale out in significant ways. For example, you can gen-
erate one or more replicas of your primary transactional database (often termed the OLTP database,
for online transactional processing), keeping both the OLTP and replica databases hosted within the
same data center (as depicted by the two SQL Database instances in the center of the diagram). Then
you can run your analysis and reports against a replica, rather than the “live” OLTP database.
Taking this approach greatly reduces the demand on your primary transactional database, which
needs to remain responsive to data-entry requests at a fast and furious rate. The OLTP database is
primarily focused on (and carefully tuned for) inserting, updating, and deleting small amounts of data
within atomic transactions (that is, brief series of operations that succeed or fail as a whole).
One of the great things about the Microsoft Azure platform is that Microsoft maintains data
centers in multiple regions around the world. This infrastructure makes it easy to geographically
locate your own applications around the world as well, if and when the need arises to scale out to that
level. Thus, you can keep applications (and their associated data) physically closer to the users con-
suming those applications. In this case, SQL Data Sync can be used to keep replicated databases (or
designated parts of replicated databases) in sync so that the same data (or the same subset of data) is
available to all instances of the application worldwide, as depicted on the right side of Figure 7-4.
Important Although the SQL Data Sync service itself is free at the time we are writing this,
your Microsoft Azure subscription will most definitely incur normal charges for data trans-
ferred between data centers by the service. (See Chapter 2, “Configuration and pricing,”
for more information.) Furthermore, there will be a performance hit as a result of all the
additional network traffic crossing data centers. For these reasons, if possible, you can and
should limit synchronization to include only that portion of your database (the minimum
number of tables and columns) that absolutely needs to be available globally.
You can learn more about Microsoft Azure Traffic Manager by visiting
http://www.windowsazure.com/en-us/services/traffic-manager.
As shown in Listing 7-1, WineCloudDb is a simple database that has Wine and Customer tables and
a few rows of data.
Note Because this procedure does nothing more than create a database and execute a
script, you can certainly follow similar steps with SQL Server Management Studio (SSMS)
instead of Visual Studio and SSDT. Likewise, you can choose instead to use the Microsoft
Azure management portal to create the database and the SQL Database management
portal to run the script. It’s largely a matter of preference, so you should use whichever
tool is readily available and most convenient for you.
To create the WineCloudDb database using Visual Studio 2013, follow these steps:
2. If the SQL Server Object Explorer is not visible, click the VIEW menu and choose SQL Server
Object Explorer.
3. In the SQL Server Object Explorer, right-click SQL Server and choose Add SQL Server to display
the familiar Connect To Server dialog.
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name and password you assigned the server when
you created it.
d. Click the Connect button. The server now appears as a collapsed node in the SQL Server
Object Explorer.
b. Click OK to confirm.
9. Type WineCloudDb, and press Enter. The new database now appears in the SQL Server
Object Explorer.
10. Right-click the WineCloudDb database, and choose New Query to open a new query window.
11. Type the code shown in Listing 7-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
12. Press Ctrl+Shift+E to execute the script (or press the play button icon in the query window’s
toolbar).
13. Close Visual Studio. (It isn’t necessary to save the script.)
To avoid incurring charges to your Microsoft Azure subscription as you follow along with these
procedures, you’ll create the new server in the same data center as the one currently hosting
WineCloudDb. You will create the sync group in the same region as well, for the same reason. This
is ideal for scaling out within a data center, but you can just as easily create the server in a different
data center (and incur outgoing data-transfer charges between them) if you want to geographically
disperse the databases and better serve global users from their nearest available data center.
Note In Chapter 1, you already created a server, opened the firewall, and created a
database. Although the following instructions are very similar to the ones we provided
then, you might still want to refer to “Creating a server” in Chapter 1 because the
instructions there include screen shots and details that are not repeated here.
5. Provide a new server administrator login name and password, and then reenter the password
to confirm.
Tip To simplify things while you are practicing, we recommend using the same
login name (for example, saz) and password you assigned to the first server.
6. Choose a region from the drop-down list. As we explained, you should select the same region
you chose for the first server so that your subscription will not incur data-transfer charges
when you synchronize between the databases on the two servers.
7. Be sure the ALLOW WINDOWS AZURE SERVICES TO ACCESS THE SERVER check box remains
selected. This setting enables Microsoft Azure services like SQL Data Sync to access the server.
8. Click the checkmark icon in the lower right side of the dialog to complete the settings.
9. Wait for the new server status to change from “creating” to “started.”
Next, you’ll open the Microsoft Azure firewall, although this procedure is technically optional. SQL
Data Sync does not require you to open the Microsoft Azure firewall to synchronize across servers.
However, as explained in Chapter 1, you do need to open the firewall for your machine’s IP address if
you want to be able to access the server from the Silverlight-based SQL Database management portal
3. To the right of your current detected IP address, click ADD TO THE ALLOWED IP ADDRESSES.
5. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page.
Now you’ll create an empty database on the new server. To do so, follow these steps:
4. For DATABASE NAME, type WineCloudDb. (Because this database will run on the new server,
it can be the same name as your existing database running on the first server.)
Note Conversely, if you were scaling out to separate OLTP activity from
reporting activity, you could use the same server. In that case, you’d be forced
to give the replica database a unique name, which might be something like
WineCloudReportingDb.
5. Choose the server you created in the previous procedure from the SERVER drop-down list.
After a few more moments, the new WineCloudDb database is created. This database is completely
empty, but it will soon contain the same tables with the same data as the existing WineCloudDb on
the first server, once you create the sync group, configure the sync rules, and run a manual sync.
When creating the sync group, you will define the hub database and the first client database.
(A client database is also often called a reference database.)
1. Click the ADD SYNC button at the bottom of the page, and then choose New Sync Group, as
shown in Figure 7-5.
b. In the REGION drop-down list, choose the same region as the two servers. The page
should appear similar to Figure 7-6.
FIGURE 7-6 Naming the sync group and choosing a region to host it in
a. For HUB DATABASE, choose the WineCloudDb database running on the first server (the
database that already has some data in it).
b. Enter the server’s administrator user name and password credentials for HUB USER NAME
and HUB PASSWORD.
c. Leave CONFLICT RESOLUTION at its default setting, Hub Wins. (We will discuss conflict
resolution shortly.) The page should look similar to Figure 7-7.
Note The conflict-resolution setting cannot be changed once the sync group
is created. You need to delete the sync group and re-create it if you want to
change the setting.
d. Click the right-arrow Next button. Because of the way the portal handles validation on
this page, you need to wait a moment after clicking it once, and then click it again.
a. For REFERENCE DATABASE, choose the WineCloudDb database running on the new server
(the empty database you just created). You’ll notice that the database selected for the
hub is not even present as a choice in the drop-down list.
c. Leave SYNC DIRECTION at its default setting, Bi-Directional. The page should look similar
to Figure 7-8.
Note The two other available sync direction settings are one-way from the hub
to the client and one-way from the client to the hub, either of which you could
choose to use rather than bi-directional, if doing so makes sense for the given
scenario. (Different one-way scenarios were discussed at the beginning of this
chapter.) Also note that the sync direction setting cannot be changed once the
sync group is created. You need to remove a client database from the sync group
and then add it again if you want to change the setting.
d. Click the checkmark “finish” button to complete the settings. Once again, you might need
to wait a moment after clicking it once, and then click it again.
As shown in Figure 7-9, the sync group has been created at this point, but it is not ready. And it will
remain in this Not Ready state until you further configure the sync group with sync rules.
Also notice that the portal has added a third link (SYNC) after the standard DATABASES and
SERVERS links on the SQL Databases page. When at least one sync group has been created, the SYNC
link takes you to this management view of all your sync groups.
To configure the sync rules and define the dataset, follow these steps:
1. Click the sync group name WineCloudSyncGroup. This opens to the REFERENCES view that lists
all the databases in the sync group, as shown in Figure 7-10. Notice that the status of each
database is Not Ready, again, because no sync rules have yet been configured.
3. Click DEFINE SYNC RULES beneath the message stating that you have no sync rules.
4. On the DEFINE DATASET dialog, select the WineCloudDb database running on the first server
(the one with pre-existing tables and data, not the new empty one).
5. Click the checkmark icon to close the dialog. The SYNC RULES page now lists all the tables
discovered in the WineCloudDb database, as shown in Figure 7-11.
FIGURE 7-11 The SYNC RULES page listing all the tables in the database that can be synchronized
FIGURE 7-12 Selecting the entire database (all columns in all tables) for synchronization
7. Click the SAVE button at the bottom of the page, as shown in Figure 7-13.
8. You might need to wait a few moments for processing before the portal becomes responsive
again. When processing completes, click the REFERENCES link at the top of the page. Both
databases are now designated with a status of Good, as shown in Figure 7-14.
FIGURE 7-14 The REFERENCES page now showing all the databases in the sync group with a Good status
9. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page for
the new sync group. As shown in Figure 7-15, the sync group has now also transitioned to a
status of Good.
At this point, SQL Data Sync has created tables in the new database exactly as they are defined in
the existing database.
Important SQL Data Sync does more than just replicate the tables you chose to be
synchronized; it also adds special objects to each database in the sync group (including
on-premises SQL Server databases). To track incremental data changes, SQL Data Sync
creates a change-tracking table for each table that is being synchronized, adds triggers
to your base tables, and also creates some stored procedures for gathering and apply-
ing changes. Therefore, it is highly recommended that you first test SQL Data Sync in a
nonproduction environment to ensure it does not have an adverse effect on your existing
databases or applications.
You’ll create an automated schedule shortly, but right now you’ll start by running a manual sync.
To do so, follow these steps:
1. Click the SYNC button at the bottom of the page, as shown in Figure 7-16. After a few
moments of processing, the sync group returns to its normal Good status. This means the
synchronization was successful.
2. Wait a few moments for processing to complete, and click the sync group name
WineCloudSyncGroup.
3. Click the LOGS link at the top of the page. The portal displays a message confirming that the
databases synchronized successfully, as shown in Figure 7-17.
Now you’ll rely on your own two eyes to verify that, indeed, everything is working as it should. The
best way to do that is to monitor both databases side by side as you change them individually, and
then synchronize those changes. You can do this by opening two separate browser tabs to the SQL
Database portal—one for each WineCloudDb database.
Note In Chapter 1, you already saw how to launch a new browser tab to the SQL Database
portal. Although the following instructions are very similar to the ones we provided then,
you might still want to refer to Chapter 1 because the instructions there include screen
shots and details that are not repeated here.
1. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page.
5. Scroll the page down a bit, and click on the MANAGE URL link in the quick glance section at
the right of the page. This opens a new browser tab to the SQL Database portal’s login page
for the first WineCloudDb database.
6. On the login page, type the administrator user name and password, and click Log On. This
takes you to the Summary page for the database.
7. Click the previous browser tab to return to the Microsoft Azure portal that just launched the
new browser tab.
8. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page.
11. Scroll down and click on the MANAGE URL link on the right to open another new browser tab
to the SQL Database portal’s login page for the second WineCloudDb database.
12. On the login page, type the administrator user name and password, and click Log On to go to
the Summary page for the database.
Now that your environment is all set up, start by running the same simple query in each database
to view and compare the contents of the Wine table. To run the queries, continue working in the cur-
rent browser tab that’s open to the SQL Database management portal for the second WineCloudDb
database, and follow these steps:
1. Click New Query at the top of the page to open a new query window.
2. Click inside the code window, and type SELECT * FROM Customer.
3. Click Run at the top of the page. SQL Database executes the query and displays the results in
the bottom portion of the query window, as shown in Figure 7-18.
4. Click the browser tab that’s open to the SQL Database management portal for the first
WineCloudDb database.
5. Click New Query at the top of the page to open a new query window.
6. Click inside the code window, and type SELECT * FROM Customer.
7. Click Run at the top of the page. SQL Database executes the query and displays the results,
which should appear identical to the results of the same query you ran in the other database,
as just seen in Figure 7-18.
Receiving identical query results in both browser tabs is a clear indication that the first manual
sync worked correctly and both databases are in sync. Now update some data on both sides and
watch the databases sync up once again. To run the updates, continue working in the current browser
tab and follow these steps:
1. On a new line below the SELECT statement you typed in the previous procedure, type
UPDATE Customer SET FavoriteWineId = 2 WHERE CustomerId = 1.
2. Click and drag the mouse to highlight the complete UPDATE statement.
4. Click the browser tab that’s open to the SQL Database management portal for the other
WineCloudDb database.
5. On a new line below the SELECT statement, type INSERT INTO Customer VALUES('Chris',
'Mayo', 3).
6. Click and drag the mouse to highlight the complete INSERT statement.
7. Click Run at the top of the page. SQL Database executes the statement and indicates that one
row was affected. That is, a new customer row for Chris Mayo was created with a favorite wine
ID of 3.
You have now modified each database separately; one database has an updated customer, and the
other has a new customer. Next, you will perform another manual sync operation and then confirm
visually that each change has been synchronized properly to the other database. To do so, follow
these steps:
1. Click the browser tab that’s open to the Microsoft Azure portal. It should still be on the
DASHBOARD page for one of the WineCloudDb databases from one of the earlier procedures.
2. Click the back icon (the large back-pointing arrow) to return to the SQL DATABASES page.
4. Click the SYNC button at the bottom of the page. After a few moments of processing, the sync
group returns to its normal Good status.
5. Click the browser tab that’s open to the SQL Database management portal for one of the
WineCloudDb databases. (It doesn’t matter which.)
6. Click and drag the mouse to highlight the complete SELECT statement.
7. Click Run at the top of the page. SQL Database executes the query and displays the results,
which reflect both the updated and inserted customer changes, as shown in Figure 7-19.
8. Click the browser tab that’s open to the SQL Database management portal for the other
WineCloudDb database.
9. Click and drag the mouse to highlight the complete SELECT statement.
10. Click Run at the top of the page. SQL Database executes the query and displays the results,
which should once again appear identical to the results of the same query you ran in the other
database, as just seen in Figure 7-19.
Receiving identical query results again in both browser tabs now verifies that bi-directional
synchronization is working. The next thing to learn about is conflict resolution. If the same customer
row is modified differently in both databases at the same time (meaning, in between synchroniza-
tions, not necessarily simultaneously), what happens on the next synchronization?
When following this next procedure, you need to pay attention to which browser tab is open to
the SQL Database management portal for the hub database, and which is open to the portal for the
client database. You can distinguish them by the server names that appear in the upper left part of
the portal page in each browser tab.
1. Click the browser tab that’s open to the SQL Database management portal for the
WineCloudDb hub database. (Check the server name in the upper left of the page.)
a. In the code window, delete the existing INSERT or UPDATE statement (but leave the
SELECT statement as-is).
b. On a new line below the SELECT statement, type UPDATE Customer SET LastName =
'Mayo-Hub' WHERE CustomerId = 4.
c. Click and drag the mouse to highlight the complete UPDATE statement.
d. Click Run at the top of the page. SQL Database executes the statement and indicates that
one row was affected.
2. Click the browser tab that’s open to the SQL Database management portal for the other
WineCloudDb database (the one on the new server).
a. In the code window, delete the existing INSERT or UPDATE statement (but leave the
SELECT statement as-is).
b. On a new line below the SELECT statement, type UPDATE Customer SET LastName =
'Mayo-Client' WHERE CustomerId = 4.
c. Click Run at the top of the page. SQL Database executes the statement and indicates that
one row was affected—the same row you just modified in the hub database (with a dif-
ferent change).
3. Click the browser tab that’s open to the Microsoft Azure portal. It should still be on the SYNC
page from the manual sync you ran in the previous procedure.
4. Click the SYNC button at the bottom of the page. After a few moments of processing, the sync
group returns to its normal Good status.
5. Click the browser tab that’s open to the SQL Database management portal for the
WineCloudDb hub database.
a. Click and drag the mouse to highlight the complete SELECT statement.
FIGURE 7-20 The hub change remains after a conflict with a client, using Hub Wins conflict resolution
6. Click the browser tab that’s open to the SQL Database management portal for the other
WineCloudDb database (the client database on the new server).
a. Click and drag the mouse to highlight the complete SELECT statement.
b. Click Run at the top of the page. SQL Database executes the query and displays the
results. As shown in Figure 7-21, the customer name change to Mayo-Client has been
overwritten by the conflicting name change (Mayo-Hub) made to the same row in the
hub database.
If the conflict resolution had been set to Client Wins, the opposite would have occurred; the
change made to the client database would have remained on the client, and the change made to the
hub would have been overwritten to reflect the client change (that is, the customer would have the
name Mayo-Client in both databases).
When a sync group has only one client database in it, the behavior with Hub Wins or Client Wins
is absolutely predictable, and it will always work as we just described. But once two or more client
databases are involved, and there are conflicting changes across multiple clients (but not the hub),
conflict-resolution behavior cannot be predicted, regardless of the setting:
■ In the case of Hub Wins, the first client change that gets written to the hub is kept. Any
conflicting data changes made in any of the other clients are discarded. Then the change
written to the hub by the first client is propagated out to all the other clients.
■ In the case of Client Wins, conflicting data changes made in all clients are written to the hub,
each one overwriting the previous one, so that the changes written by the last client are then
propagated out to all the other clients.
Fortunately, SQL Data Sync supports a simple scheduling mechanism. Automation is either turned
on or off. If automation is turned on, the schedule frequency can range from (approximately) five
minutes to one month. This means the most frequently you can synchronize the databases in your
sync group is once every five minutes.
Note We say approximately parenthetically, because the service does not guarantee
precise timing. For example, you might request to run every 10 minutes and find
occasionally that 11 minutes elapse between two passes.
To set an automated sync schedule that runs every 10 minutes, follow these steps:
1. Click the browser tab that’s open to the Microsoft Azure portal. It should still be on the SYNC
page from the manual sync you ran in previous procedures.
6. Click anywhere on the page. This is necessary to shift focus away from the SYNC FREQUENCY
text box, which causes the SAVE button to appear at the bottom of the page, as shown in
Figure 7-22.
The schedule is now set. To test the schedule, follow these steps:
1. Click the browser tab that’s open to the SQL Database management portal for one of the
WineCloudDb databases. (It doesn’t matter which one.)
2. In the code window, delete the existing INSERT or UPDATE statement (but leave the SELECT
statement as-is).
3. On a new line below the SELECT statement, type UPDATE Customer SET LastName =
'Mayo-Auto' WHERE CustomerId = 4.
4. Take a break for 10 minutes (or more). Have a glass of wine (or two). You’ve earned it!
5. Run the SELECT statement in each of the SQL Database management portal browser tabs
open to the hub and the client, and notice how the customer name is now Mayo-Auto in both
databases.
During your 10-minute break, an automatic synchronization pass ran, which copied the change to
the other database. At this point, you can leave things alone and let the service synchronize every 10
minutes automatically, although you can still sync manually any time you want to, of course.
If you have access to a SQL Server instance you can create a local database on, you can use that
SQL Server instance. Otherwise, you will need to install the SQL Server Express edition to host a local
database so that you can continue following along. A step-by-step procedure for doing so can be
found in the Introduction, in the section “Installing SQL Server Express Edition.”
Note This chapter assumes you are using the SQL Server Express edition for your local SQL
Server database, which has an instance name of .\sqlexpress. If you are using another edi-
tion, you must replace the instance name .\sqlexpress specified in the instructions with the
name of the instance you are using. For example, if you are running a primary instance of
the SQL Server Developer edition on your local machine, you can simply specify the dot (.)
symbol or localhost. If you are running a named instance on your local machine, append
a backslash followed by the name of the instance (for example, .\myinstance or localhost\
myinstance).
In the next procedure, you will create a new database on your local SQL Server instance called
WineLocalDb. The local database will start out completely empty, but it will soon become a replica of
the other WineCloudDb databases once you add it to the sync group. (This is the same approach you
took earlier when you created the second WineCloudDb database and added it to the sync group.)
2. If the SQL Server Object Explorer is not visible, click the VIEW menu and choose SQL Server
Object Explorer.
3. In the SQL Server Object Explorer, right-click SQL Server and choose Add SQL Server to
display the Connect To Server dialog.
4. For Server Name, type .\sqlexpress (or the name of your local SQL Server instance).
5. For Authentication, choose Windows Authentication. Or, if your local SQL Server instance is
configured not to support Windows Authentication, choose SQL Server Authentication and
supply valid credentials for Login and Password.
6. Click Connect. The local instance now appears as a collapsed node in the SQL Server Object
Explorer.
9. Type WineLocalDb and press Enter. The new database now appears in the SQL Server
Object Explorer, as shown in Figure 7-24.
FIGURE 7-24 Creating the WineLocalDb database on a local SQL Server instance
You now have an empty on-premises database named WineLocalDb. Next, you will configure a
sync agent so that this database can participate in the sync group.
With this approach, the SQL Data Sync service does not communicate directly with the local
database; instead, all local-to-cloud communications take place through the agent. This means that
the SQL Data Sync service in the cloud can access your on-premises SQL Server databases, even if
they are located behind a firewall (which is typically the case in production environments). When the
service communicates with the agent, it does so using encrypted connections and a unique token or
agent key. The SQL Server databases authenticate the agent using the connection string and agent
key, which provides a high level of security.
1. Log in to the Microsoft Azure portal (or return to the browser window where the Microsoft
Azure portal is running, if it’s still open).
3. Click the ADD SYNC button at the bottom of the page, and then choose New Sync Agent,
as shown in Figure 7-25.
4. Click the Install One Here link, as shown in Figure 7-26. This opens a new browser tab to a
download page for the sync agent.
8. If you receive a pop-up warning, click Allow Once, as shown in Figure 7-28.
9. When prompted to run or save the file, choose Run. This downloads and starts the Microsoft
SQL Data Sync Agent Preview Setup Wizard:
b. On the License Agreement And Privacy Information page, click I Agree and then click
Next.
c. Type a local Windows user name and password for the account that the agent service
should run under. The user name should include the local domain or machine name
followed by a backslash, as shown in Figure 7-29.
FIGURE 7-29 Configuring the Windows account that the local agent will use to access on-premises SQL
Server databases.
d. Click Next.
1. Return to the browser tab that’s still open to the Microsoft Azure portal on the New Sync
Agent page.
3. In the REGION drop-down list, choose the same region as the time zone your local machine is
set to. The page should appear similar to Figure 7-30.
Important You will not be able to register local databases with a sync agent
located in a different region.
4. Click the checkmark icon in the lower-right side of the dialog to complete the settings.
5. After a few moments, the agent is created and appears offline, as shown in Figure 7-31.
6. Click WineSyncAgent. This displays the local databases registered with the agent, which is cur-
rently an empty list.
9. Click the clipboard copy button to the right of the generated key, as shown in Figure 7-32.
(If you are prompted for clipboard access by Internet Explorer, click Allow Access.)
FIGURE 7-32 Generating an access key that can be used to register local SQL Server databases with
the agent
1. From the Windows Start screen, launch Microsoft SQL Data Sync Agent Preview. You can
either scroll through the tiles to find it or just type data sync agent to run an app search, and
then click on the Microsoft SQL Data Sync Agent Preview tile, as shown in Figure 7-33.
FIGURE 7-33 Launching the SQL Data Sync agent from the Windows 8 Start screen.
4. Right-click in the Agent Key text box, and choose Paste. This pastes the key generated on the
portal, as shown in Figure 7-34.
FIGURE 7-34 Providing the service-generated access key to the local agent service
5. Click OK.
6. Click the Ping Sync Service button. You should receive a message dialog stating that the agent
successfully pinged SQL Data Sync, which you can close by clicking OK.
a. For Authentication, choose Windows Authentication. Or, if your local SQL Server instance
does not support Windows Authentication, choose SQL Server Authentication and supply
valid credentials for Login and Password.
b. For Server Name, type .\sqlexpress (or the name of your local SQL Server instance).
FIGURE 7-35 The SQL Server Configuration dialog, which is used to register a local SQL Server database
with Microsoft Azure SQL Data Sync
d. Click Save. The database is registered with the local agent service, as shown in
Figure 7-36.
FIGURE 7-36 Registering an on-premises SQL Server database with the local SQL Data Sync agent service
8. Return to the browser tab that’s still open to the Microsoft Azure portal on the Manage
Access Key page.
9. Click the checkmark icon in the lower-right side of the dialog to close the Manage Access Key
page.
FIGURE 7-37 Viewing on-premises SQL Server databases registered with the agent
Now that the on-premises WineLocalDb database is registered with the agent, it’s easy to add it to
the sync group. To do so, follow these steps:
1. Click the back icon (the large back-pointing arrow) to return to the SYNC page. As shown in
Figure 7-38, the sync agent status has now transitioned from Offline to Online. (It might be
necessary to refresh the page by pressing F5 to see the transitioned status.)
FIGURE 7-38 The WineSyncAgent now appears online in the Microsoft Azure portal
3. Click the ADD button at the bottom of the page to display the Add A Reference Database
dialog.
a. For REFERENCE DATABASE, choose the WineLocalDb database that appears beneath SQL
Server Databases - WineSyncAgent, as shown in Figure 7-39.
FIGURE 7-39 Selecting the on-premises database registered with the agent for inclusion in the sync
group.
b. If the local database was registered using Windows Authentication, no credentials are
required and the USER NAME and PASSWORD text boxes will be disabled. Otherwise,
enter the user name and password you supplied when you registered the local database
in the previous procedure’s step 7.
c. Leave the SYNC DIRECTION at its default setting, Bi-Directional. The page should look
similar to Figure 7-40.
5. Click the SAVE button at the bottom of the page. After a few moments of processing, the
database is added to the sync group, and all three databases (the two WineCloudDb databases
and the one on-premises WineLocalDb database) are designated with a status of Good, as
shown in Figure 7-41.
FIGURE 7-41 The sync group configured with two Microsoft Azure SQL Databases and one on-premises
SQL Server database
7. Switch back to Visual Studio, which should still be open from when you created the empty
WineLocalDb database in a previous step.
8. In the SQL Server Object Explorer, expand the WineLocalDb database node, and then
expand the Tables node nested beneath the database node.
9. Right-click the dbo.Customer table, and choose View Data. As shown in Figure 7-42, the table
now contains all the customer data pulled in from the WineCloudDb hub database.
FIGURE 7-42 The on-premises SQL Server database now containing the customer data synchronized from
the WineCloudDb hub database
Congratulations! You just created a fully functional sync group that synchronizes data
bi-directionally between multiple SQL Databases on Microsoft Azure and an on-premises SQL
Server database running locally. The client databases (the second WineCloudDb database and the
on-premises WineLocalDb database) are both configured for bi-directional synchronization, so
any changes made in any of the databases in the sync group (including the hub) will be replicated
automatically across all the other databases.
Furthermore, because the sync group’s conflict resolution is set to Hub Wins, changes made to
rows in the hub database will overwrite changes made to the same rows in any of the client databases
if those changes are made within the same sync interval. And, as explained in the section “Establish-
ing Conflict Resolution,” if conflicting changes are made across only client databases but not the hub,
the outcome of the next synchronization cannot be predicted.
Next, limit your synchronization to include just the items you need to sync. As you saw, SQL Data
Sync doesn’t require the entire database to participate in a sync group, so you should always select
the fewest tables and columns possible when you configure your dataset. This practice improves
performance by reducing the overall payload of a synchronization pass.
Another consideration to bear in mind is the frequency with which a synchronization pass occurs.
If a pass attempts to synchronize a sync group that has not yet completed a prior synchronization,
the attempt will fail. When planning a schedule, take care that you set the interval sufficiently
large to ensure that synchronization completes before the next synchronization pass starts. Also,
remember that the intervals are approximate, and that the finest automation schedule you can
implement is once every five minutes.
The sync schedule can affect your SQL Database costs as well. Although at the time of this writing
Microsoft offers SQL Data Sync as a free service, SQL Database fees are charged according to the
amount of data moved out of a data center. To minimize costs, you should consider dividing data into
separate sync groups according to the frequency with which the data changes. Volatile data should
be synchronized at a higher frequency than static or lookup data. Partitioning sync groups in this way
allows you to configure an optimal schedule that helps reduce costs by sending data less frequently.
One pitfall to avoid when setting up multiple sync groups is a condition known as a sync loop. A
sync loop occurs when a change in a record in one sync group is rewritten to the same record by a
second sync group, similar to a circular reference. This highly undesirable condition can potentially
enter an infinite loop and consume enough resources to significantly degrade performance. Further-
more, you will pay fees for moving data into and out of SQL Database unnecessarily. You can avoid
sync loops in a few ways:
■ Design your sync groups such that a loop cannot occur; that is, don’t let the same table be
synchronized by two different sync groups.
■ Encrypted SQL Server connections are further secured using an agent key.
Summary
This chapter taught you all about SQL Data Sync, a Microsoft Azure service that provides automatic
data replication across any number of SQL Databases hosted on Microsoft Azure as well as local SQL
Server databases hosted on-premises.
We began by explaining the hub-and-spoke architecture upon which SQL Data Sync is based, and
then described the variety of scenarios in which the service can be configured. This includes one-way
replication from data mastered in SQL Server on-premises to SQL Databases on Microsoft Azure, one-
way replication in the reverse direction to pull data populated in Microsoft Azure SQL Database down
to an on-premises SQL Server database, and full bi-directional synchronization across SQL Databases
hosted in multiple Microsoft Azure data centers and local SQL Server databases hosted in multiple
on-premises locations.
With that foundation laid, you then created a sync group to synchronize between two Microsoft
Azure SQL Databases in the cloud, learned about the two different conflict-resolution strategies (Hub
Wins and Client Wins), and set up an automated schedule to keep the databases in sync on a regular
basis. Finally, you installed and configured a sync agent and registered a local on-premises SQL Server
database with the sync group to achieve complete two-way synchronization between the local SQL
Server database and the SQL Databases in the cloud.
When developing applications and systems intended for real use, performance is an important
consideration. Today’s users have short attention spans and need immediate results, which
means their applications must be responsive and deliver results quickly. Data lies at the core of many
systems, and often that data is stored in relational databases like SQL Database. Optimizing data
access can often provide significant improvements and benefits to application performance.
In this chapter, you will optimize and tune database performance for Microsoft Azure SQL
Database. This includes optimizing execution speed and performance, as well as reliability. To
demonstrate these concepts throughout this chapter, a reference application is needed. In the second
section, “Creating a RESTful Web API,” you will create an ASP.NET Web API that works with data from
SQL Database using both ADO.NET and Entity Framework (EF). Then you will improve the reliability
and performance of this Web API by managing database connections and connection errors, reduc-
ing latency and considering other optimizations like using the most appropriate storage service for
your data. Later in the chapter, you learn how to scale up SQL Database using SQL Database Premium.
The last sections of the chapter guide you through scaling your SQL Database out with a partitioning
strategy known as sharding.
Entity Framework is a Microsoft data access framework that simplifies mapping database objects to
.NET objects. EF uses ADO.NET internally for data access, but the additional object relational mapping
that EF performs adds performance overhead. When optimizing data access performance, one com-
mon technique is to reduce the number of abstractions between your application code and data-
bases. Using lower-level data access technologies like ADO.NET directly often improves data access
performance. In addition to performance optimization, many existing applications use ADO.NET for
data access. Therefore, both ADO.NET and EF are discussed in this chapter.
Note This chapter uses EF and Web API to demonstrate performance concepts.
Chapter 10, “Building cloud solutions,” delves into much more detail on both of these
important technologies.
Achieving high performance in the cloud
A virtually infinite supply of hardware and computing resources is one of the major value
propositions for moving to the cloud. While the quantity of computing resources owned and
managed by a cloud vendor is ultimately finite, at peak loads, most organizations would demand
only a small fraction of the available resources. However, most applications, especially typical enter-
prise applications, are not designed in a way that can make use of this large pool of hardware. Most
applications are not designed to scale horizontally (often called scaling out)—that is, to chunk up
and distribute the load across many servers and storage nodes. Instead, most applications depend
on having control of the hardware and scaling vertically (often called scaling up)—that is, to increase
the capacity and performance of centralized computing resources by purchasing larger and more
powerful servers and storage devices.
Cloud vendors such as Microsoft provide some ability to scale up. At the time of this writing,
Microsoft Azure compute instances range in size from a shared 1-GHz CPU with 768 MB of RAM to
a 16-by-2.6 GHz CPU instance with 112 GB of RAM. The Microsoft Azure Platform as a Service (PaaS)
services, such as SQL Database, also have some ability to scale up. For example, a single SQL Data-
base can range in size from a 100-MB database to 150 GB, and with the Preview availability of SQL
Database Premium, the computing resources for a single SQL Database server can be scaled. (You will
learn more about SQL Database Premium in the section “Scaling up SQL Database.”)
Microsoft Azure provides some capability to scale computing resources up; unfortunately, there
will always be a physical limit and upper bound to the amount of computing capacity you can get
from a single resource, whether that’s a server, storage device, or service. To scale big, you must
scale out. And to scale out, your applications must be architected in a way that allows them to be
distributed across multiple instances of computing hardware.
In addition to scalability, because you don’t control the hardware configuration that SQL Database
is running on, you can’t scale the server hardware up with the same control as you could in your
own data center, and because SQL Database is a multitenant service with multiple customers sharing
the same physical compute resources, the performance characteristics will quite likely be different
than that of your own private data center. As a result, you might have to tune database performance
differently than you would in your own data center.
Note In other chapters, the WineCloudDb database included Wine, Customer, and Order
tables. This chapter does not make use of the Order table, which is why it is omitted here.
LISTING 8-1 Script to create the sample WineCloudDb database for the Web APIs
This script creates the WineCloudDb tables and loads some sample data into them. Once you
create this sample data, you will build Web APIs that return the wine and customer data.
1. From the Windows Start screen, launch SQL Server Management Studio (SSMS). You can
either scroll through the app tiles to find it (in the Microsoft SQL Server 2012 category) or just
type sql server management studio to run a search, and then click on the tile. After a brief
moment, the Connect To Server dialog appears.
2. In the Connect To Server dialog, do the following:
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name and password you assigned the server when
you created it.
4. If the WineCloudDb database exists from a previous chapter, delete it now by doing the
following:
a. In the Object Explorer, right-click the server name and choose New Query to open a new
query window connected to the master database.
c. Press F5 (or click the Execute button in the toolbar) to create the database.
6. In the Object Explorer, right-click the Databases node and choose Refresh. The WineCloudDb
database you just created should now appear.
7. Right-click the WineCloudDb database, and choose New Query to open a new query window
connected to the WineCloudDb database.
8. Type the code shown in Listing 8-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
9. Press F5 (or click the Execute button in the toolbar) to create the database schema and
populate some data.
You now have a new WineCloudDb database that will serve as the data source for your Web APIs.
2. Click the FILE menu, and then choose New | Project to display the New Project dialog.
3. On the left of the New Project dialog, expand Templates, Other Project Types, and choose
Visual Studio Solutions.
4. Select the Blank Solution template, name the solution WineSolution, and choose any desired
location for the solution, as shown in Figure 8-1.
FIGURE 8-1 The New Project dialog for creating the Blank Solution
The Solution Explorer now shows the new WineSolution. (If the Solution Explorer is not visible, click
the VIEW menu and choose Solution Explorer.) Now that you have an empty solution, you’re ready to
create a new ASP.NET Web API project.
1. Right-click WineSolution in Solution Explorer, and choose Add | New Project to display the Add
New Project dialog.
2. On the left of the New Project dialog, expand Installed, expand Visual C#, and choose Web.
3. Choose the ASP.NET Web Application template, which is typically selected by default.
4. Name the project WineCloudWebApi and click OK, as shown in Figure 8-2, to display the
New ASP.NET Project dialog.
5. Choose the Web API template and click OK, as shown in Figure 8-3.
You have now created an empty ASP.NET Web Application project with references to the ASP.NET
Web API assemblies. The project also references the ASP.NET MVC assemblies, which makes it possible
to run both Web API and MVC applications in the same ASP.NET Web Application project.
In earlier versions of Entity Framework, the entity model and database mappings could be
configured only in an EDMX file, typically using the graphical Entity Data Model (EDM) designer in
Visual Studio. With the Code First feature added in EF 4, you can choose instead to create your own
Plain Old CLR Object (POCO) classes and configure the database conventions and mapping in code.
This approach provides a number of advantages over using EDMX files. One of the most significant
advantages is loose coupling of models from the persistence framework. This makes it easier to test
your entity models in isolation from your database, add additional properties and methods, and use
them across multiple application layers.
Creating a POCO class for use with Entity Framework is quite simple, and there are just a few rules
you must follow. First, the class needs to be public. Second, the properties need to be public. And last,
the data types of the model need to match the columns in your table. Entity Framework Code First is
convention based, which means a number of patterns are built into Entity Framework. If these pat-
terns are followed in your entity model and database design, Entity Framework can automatically map
your entity model to the database without any manual configuration. The entity model for the Wine
table, as shown in Listing 8-2, contains properties that match the columns in the Wine table, but there
are no dependencies on Entity Framework. Entity Framework knows the WineId property maps to the
primary key of the Wine table, because the naming follows the convention of class name followed by
“Id” and the data type of the property is numeric or a GUID. Listing 8-3 contains the context class you
will customize to override EF’s default pluralization naming strategy.
More Info You can learn more about the Entity Framework conventions at
http://msdn.microsoft.com/en-us/data/jj679962.aspx.
By default, EF will create a database and tables using the Entity Framework conventions and the
POCO entity model classes. However, this database initialization strategy can be easily disabled so
that you get to enjoy a combination of both database-first and code-first experiences with EF.
That is, you don’t need to let EF “reverse-engineer” a database from your code if you want to
use code first; you can still create the database on your own and use code first at the same time.
It’s easy to tell Entity Framework not to create the database and tables by calling
Database.SetInitializer<WineDbContext>(null) at the end of the Application_Start method in
the Global.asax.cs.
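For orientation, here is a minimal sketch of what that change looks like in Global.asax.cs. The class name and the WebApiConfig registration line come from the Visual Studio project template (your generated file will contain additional registration calls that are omitted here); only the Database.SetInitializer call is the line you add:

using System.Data.Entity;
using System.Web.Http;
using WineCloudWebApi.Models;

namespace WineCloudWebApi
{
    public class WebApiApplication : System.Web.HttpApplication
    {
        protected void Application_Start()
        {
            // Route registration generated by the ASP.NET Web API project template
            GlobalConfiguration.Configure(WebApiConfig.Register);

            // Added line: tell EF not to create or drop the database,
            // because WineCloudDb already exists in SQL Database
            Database.SetInitializer<WineDbContext>(null);
        }
    }
}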
One of the default Entity Framework Code First conventions is to pluralize table names. This means
that, by default, Entity Framework expects that Wine entities are stored in a table named Wines
(plural). However, the WineCloudDb database has a Wine table (singular), not a Wines table. Again, it’s
easy to override this default behavior by overriding the EF context’s OnModelCreating method and
calling modelBuilder.Conventions.Remove<PluralizingTableNameConvention>(). Removing this default
convention will match the entity class name to a table name, which will map the Wine entity POCO
class to the same-named Wine table in the database.
using System;

namespace WineCloudWebApi.Models
{
    public class Wine
    {
        public int WineId { get; set; }
        public string Name { get; set; }
        public string Category { get; set; }
        public int? Year { get; set; }
        public Decimal? Price { get; set; }
    }
}
using System.Data.Entity;
using System.Data.Entity.ModelConfiguration.Conventions;

namespace WineCloudWebApi.Models
{
    public class WineDbContext : DbContext
    {
        public WineDbContext() : base("name=WineDbContext") {}

        public DbSet<Wine> Wines { get; set; }

        protected override void OnModelCreating(DbModelBuilder modelBuilder)
        {
            modelBuilder.Conventions.Remove<PluralizingTableNameConvention>();
        }
    }
}
To create a Web API using Entity Framework for the Wine table, follow these steps:
1. Right-click the Models folder beneath the WineCloudWebApi project in the Solution Explorer,
and choose Add | Class to display the Add New Item dialog.
2. Name the class Wine.cs and click Add, as shown in Figure 8-4.
3. Replace the template code generated automatically by Visual Studio with the code shown in
Listing 8-2.
4. Build the WineSolution by selecting Build | Build Solution in the menu at the top or pressing
Ctrl+Shift+B.
Note The next step in this procedure is to add a new Web API controller using
the Wine model class you just created. Visual Studio finds the model classes using
the built assemblies within the solution. This requires you to build the solution
now so that Visual Studio can find the model class in the next step.
5. Right-click the Controllers folder in the WineCloudWebApi project in the Solution Explorer,
and choose Add | Controller to display the Add Scaffold dialog.
6. Choose the Web API 2 Controller With Actions, Using Entity Framework controller, and click
Add, as shown in Figure 8-5. This scaffold automatically creates Web API methods in the new
controller class that retrieve (GET) and update (PUT, POST, DELETE) entities.
7. In the Add Controller dialog, supply the information for the new Wine controller by doing the
following:
c. For the Model class, select Wine (WineCloudWebApi.Models) from the drop-down list.
d. For the Data context class, click the New Data Context button and type
WineCloudWebApi.Models.WineDbContext in the New Data Context Type text box
and click Add, as shown in Figure 8-6.
e. The Add Controller dialog should now appear as shown in Figure 8-7. Click Add to create
the controller.
FIGURE 8-7 Adding and configuring the WineController class with Entity Framework
8. Double-click the Web.config file beneath the WineCloudWebApi project in the Solution
Explorer, and locate the WineDbContext connection string setting in the connectionStrings
section.
a. Replace <servername> with the name of the SQL Database server that contains the
WineCloudDb database.
b. Replace <username> and <password> with the user name and password you assigned
the server when you created it.
10. Double-click the WineDbContext.cs file beneath the Models folder in the Solution Explorer
and replace the template code with the code in Listing 8-3. The code adds an override of the
OnModelCreating method that removes the default pluralizing table convention. This tells
EF to look for a table named Wine (singular) and not Wines (plural) for storing rows of Wine
entities in the database.
11. Double-click the Global.asax.cs ile beneath the WineCloudWebApi project in the Solution
Explorer.
12. Make the following changes to remove the default database initialization strategy, and tell EF
not to attempt to create the database (because the database already exists):
a. Add the following two using statements at the top of the ile:
using System.Data.Entity;
using WineCloudWebApi.Models;
b. Add the following line of code at the end of the Application_Start method:
Database.SetInitializer<WineDbContext>(null);
13. Build the WineSolution by selecting Build | Build Solution in the menu at the top or pressing
Ctrl+Shift+B.
You have now created a RESTful Web API for the Wine table using ASP.NET Web API and Entity
Framework, and you’re ready to test it.
Note You can learn more about Entity Framework and ASP.NET Web API in Chapter 10. In
Chapter 10, you will learn how to build a multitier web and mobile application in Microsoft
Azure using SQL Database.
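Before testing, it can help to know roughly what the scaffold generated for you. The following abbreviated sketch shows only the two GET actions; the scaffolded class also contains PUT, POST, and DELETE actions, and the exact member names are determined by the scaffolding conventions rather than reproduced from the scaffold output here:

using System.Linq;
using System.Web.Http;
using System.Web.Http.Description;
using WineCloudWebApi.Models;

namespace WineCloudWebApi.Controllers
{
    public class WineController : ApiController
    {
        private WineDbContext db = new WineDbContext();

        // GET api/Wine
        public IQueryable<Wine> GetWines()
        {
            return db.Wines;
        }

        // GET api/Wine/5
        [ResponseType(typeof(Wine))]
        public IHttpActionResult GetWine(int id)
        {
            Wine wine = db.Wines.Find(id);
            if (wine == null)
            {
                return NotFound();
            }
            return Ok(wine);
        }

        protected override void Dispose(bool disposing)
        {
            if (disposing)
            {
                db.Dispose();
            }
            base.Dispose(disposing);
        }
    }
}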
1. Select the WineSolution in the Solution Explorer, and press F5 or click Debug | Start
Debugging. This opens Internet Explorer (or your default debugging web browser) at the
default page of the WineCloudWebApi project, as shown in Figure 8-8.
2. Append the URL in the browser’s address bar with api/Wine, and press Enter. This executes
the GetWines method on the WineController and responds with the list of wines from the
WineCloudDb database. Internet Explorer’s default behavior asks if you would like to save or
open the results from the Web API call, as shown in Figure 8-9.
Note ASP.NET Web API implements a feature called content negotiation. The
HTTP specification (RFC 2616) defines content negotiation as “the process of
selecting the best representation for a given response when there are multiple
representations available.” The most common way to handle content negotia-
tion is with HTTP Accept request headers. Browsers have different default Accept
headers. If you make a GET request to the Wine API in multiple browsers, you are
likely to get a different response. In Internet Explorer, you will often get a JSON-
formatted response; in Chrome, you will often get an XML-formatted response.
3. Click Open to view the list of wines returned in the JSON results. (If prompted for how to open
this type of file, click More Options and choose Notepad.)
4. Append the URL in the browser’s address bar with /2, and press Enter. This executes the
GetWine method on the WineController and responds with the Wine record for WineId 2.
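If you want to see content negotiation at work without relying on a particular browser’s default Accept header, you can call the Wine API from a small console client and request JSON (or XML) explicitly. The following sketch assumes the Web API is running locally under IIS Express; the port number is a placeholder that you would replace with the one assigned to your project:

using System;
using System.Net.Http;
using System.Net.Http.Headers;

class ContentNegotiationDemo
{
    static void Main()
    {
        using (var client = new HttpClient())
        {
            // Placeholder address; use the port IIS Express assigned to WineCloudWebApi
            client.BaseAddress = new Uri("http://localhost:1234/");

            // Request JSON explicitly; change to "application/xml" to get an XML response
            client.DefaultRequestHeaders.Accept.Clear();
            client.DefaultRequestHeaders.Accept.Add(
                new MediaTypeWithQualityHeaderValue("application/json"));

            HttpResponseMessage response = client.GetAsync("api/Wine").Result;
            Console.WriteLine(response.Content.ReadAsStringAsync().Result);
        }
    }
}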
You have now created a Wine Web API using ASP.NET Web API and Entity Framework Code First.
You have also tested it in the browser and verified that it returns results from the WineCloudDb data-
base. Next you will add a Web API Controller for the Customer table using raw ADO.NET rather than
Entity Framework. It is common to have enterprise applications that use ADO.NET because they were
developed prior to the introduction of Entity Framework. Raw ADO.NET is also commonly used when
trying to boost data-access performance. As a result, in the following section you will use ADO.NET to
build a Web API controller for the Customer table.
namespace WineCloudWebApi.Models
{
    public class Customer
    {
        public int CustomerId { get; set; }
        public string FirstName { get; set; }
        public string LastName { get; set; }
        public int? FavoriteWineId { get; set; }
    }
}
using System;
using System.Collections.Generic;
using System.Configuration;
using System.Data.SqlClient;
using System.Web.Http;
using System.Web.Http.Description;
using WineCloudWebApi.Models;

namespace WineCloudWebApi.Controllers
{
    public class CustomerController : ApiController
    {
        // GET api/Customer
        public IList<Customer> GetCustomers()
        {
            IList<Customer> customers = new List<Customer>();
            var connectionString =
                ConfigurationManager.ConnectionStrings["WineDbContext"].ConnectionString;

            using (var connection = new SqlConnection(connectionString))
            {
                var commandText = "SELECT * FROM Customer";
                using (var command = new SqlCommand(commandText, connection))
                {
                    connection.Open();
                    using (var reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            customers.Add(new Customer
                            {
                                CustomerId = Convert.ToInt32(reader["CustomerId"]),
                                FirstName = reader["FirstName"].ToString(),
                                LastName = reader["LastName"].ToString(),
                                FavoriteWineId = reader["FavoriteWineId"] as int?
                            });
                        }
                    }
                }
            }

            return customers;
        }

        // GET api/Customer/5
        [ResponseType(typeof(Customer))]
        public IHttpActionResult GetCustomer(int id)
        {
            Customer customer = null;
            var connectionString =
                ConfigurationManager.ConnectionStrings["WineDbContext"].ConnectionString;

            using (var connection = new SqlConnection(connectionString))
            {
                var commandText = "SELECT * FROM Customer WHERE CustomerId = @CustomerId";
                using (var command = new SqlCommand(commandText, connection))
                {
                    command.Parameters.AddWithValue("@CustomerId", id);
                    connection.Open();
                    using (var reader = command.ExecuteReader())
                    {
                        if (reader.Read())
                        {
                            customer = new Customer
                            {
                                CustomerId = Convert.ToInt32(reader["CustomerId"]),
                                FirstName = reader["FirstName"].ToString(),
                                LastName = reader["LastName"].ToString(),
                                FavoriteWineId = reader["FavoriteWineId"] as int?
                            };
                        }
                    }
                }
            }

            if (customer == null)
            {
                return NotFound();
            }
            return Ok(customer);
        }
    }
}
1. Right-click the Models folder in the WineCloudWebApi project in the Solution Explorer, and
choose Add | Class to display the Add New Item dialog.
3. Replace the template code generated automatically by Visual Studio with the code shown in
Listing 8-4 earlier, and build the WineSolution by selecting Build | Build Solution in the menu
at the top or pressing Ctrl+Shift+B.
4. Right-click the Controllers folder in the WineCloudWebApi project in the Solution Explorer,
and choose Add | Controller to display the Add Scaffold dialog.
5. Choose the Web API 2 Controller - Empty controller, and click Add. This displays the Add
Controller dialog.
7. Replace the template code with the code shown in Listing 8-5 earlier, and build the
WineSolution by selecting Build | Build Solution in the menu at the top or pressing
Ctrl+Shift+B.
You have now added a new ASP.NET Web API controller for the Customer table using ADO.NET
instead of Entity Framework. The Customer Web API provides methods to get customer data, but it
doesn’t provide methods for adding or updating customers. Similar to the Wine entity model, the
Customer entity model, as shown in Listing 8-4, doesn’t have any data-access dependencies and is
a simple POCO class. The Customer controller, as shown in Listing 8-5, queries the Customer table
using the ADO.NET SqlConnection, SqlCommand, and SqlDataReader classes. It creates and populates
instances of the Customer entity model class and returns the Customer objects. ASP.NET Web API
then serializes the Customer objects into the appropriate response formats and sends them back to
the requestor.
1. Select the WineSolution in the Solution Explorer, and press F5 or click Debug | Start
Debugging. This opens Internet Explorer or your default debugging web browser at the
default page of the WineCloudWebApi project.
2. Append the URL in the browser’s address bar with api/Customer, and press Enter. This
executes the GetCustomers method on the CustomerController and responds with the list of
Customers from the WineCloudDb database. Internet Explorer’s default behavior asks if you
would like to save or open the results from the Web API call.
4. Append the URL in the browser’s address bar with /3, and press Enter. This executes the
GetCustomer method on the CustomerController and responds with the Customer record for
CustomerId 3.
You have now set up an ASP.NET Web API project to get data from the Wine and Customer tables
in the WineCloudDb database. Your project has Web APIs that use both Entity Framework Code First
and raw ADO.NET. With this project now in place, we will build upon it in the following sections of this
chapter to demonstrate performance tuning and scalability with SQL Database.
Managing SQL Database connections
Database connections in Microsoft Azure SQL Database can have different characteristics and
behaviors than you’ve likely experienced with SQL Server on-premises. To provide reliability and great
user experiences, these behaviors must be accounted for in your applications that use SQL Database.
In addition to these different connection behaviors, you should also implement general best practices
for interacting with databases, regardless of whether they are on-premises or in the cloud.
Pooling connections
There is overhead and a performance penalty when establishing new database connections. To help
minimize this performance penalty, ADO.NET can pool connections. Connection pooling works by
keeping a client’s connections open in a pool of managed connections. When the client needs to
open a connection, ADO.NET checks for an existing connection in the pool with a connection string
that matches. If a connection already exists, it returns that connection to the client to interact with the
database. If a connection doesn't exist, a new connection is established. When the client is finished
with the connection and closes it, instead of being destroyed, the connection is returned
to the pool. Only connections with the exact same configuration and connection strings can be
pooled, and the connection pool is scoped to an AppDomain. ADO.NET uses connection pooling by
default, and it is a generally recommended best practice because it dramatically reduces the cost of
opening connections.
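Pooling behavior can also be tuned through connection-string keywords. The following sketch (using the System.Data.SqlClient namespace, with placeholder server and credential values) shows the standard pattern of opening connections late and closing them early so they return to the pool quickly, along with the Pooling and Max Pool Size keywords that control the pool:

// Hypothetical connection string; <ServerName>, <UserName>, and <Password> are placeholders.
var connectionString =
    "Server=tcp:<ServerName>.database.windows.net,1433;Database=WineCloudDb;" +
    "User ID=<UserName>@<ServerName>;Password=<Password>;Encrypt=True;" +
    "Pooling=true;Max Pool Size=100;";   // pooling keywords (these values are the ADO.NET defaults)

for (int i = 0; i < 10; i++)
{
    using (var connection = new SqlConnection(connectionString))
    {
        connection.Open();   // after the first iteration, this reuses a pooled connection
        // ... execute commands ...
    }                        // Dispose/Close returns the connection to the pool
}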
Not all terminated connections are caused by transient errors. However, errors raised by SQL
Database return an error number you can use to determine what type of error occurred. Your
application can compare the error number to the known transient error numbers. If there’s a match,
your application can try to reconnect and retry the transaction; if there’s not a match, the error is not
temporary and will need to be handled differently. If that sounds like a lot of work, don’t worry—the
Microsoft Azure Customer Advisory Team, in conjunction with Microsoft Patterns & Practices, has
simplified this by creating the Transient Fault Handling Application Block. This library is part of the
Enterprise Library family of application blocks, and it can be integrated into your application to
simplify recovery from transient faults. It can be used with SQL Database and other Microsoft Azure
services that might have transient errors, including Service Bus and Storage.
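For example, a manual check (without the application block) might catch SqlException and compare its Number property against the transient error numbers that Microsoft documents for SQL Database. The numbers shown below are a few commonly cited examples, not an exhaustive list, so treat this purely as a sketch and take the authoritative list from the SQL Database error-code documentation:

// A few commonly documented transient error numbers (not an exhaustive list).
private static readonly HashSet<int> TransientErrorNumbers =
    new HashSet<int> { 40197, 40501, 40613, 10053, 10054, 10060, 233 };

private static bool IsTransientError(SqlException ex)
{
    // An error is treated as transient only if its number appears in the known list.
    return TransientErrorNumbers.Contains(ex.Number);
}

// Usage: catch SqlException and retry the operation only when it is transient, for example:
// try { ... } catch (SqlException ex) { if (!IsTransientError(ex)) throw; /* else wait, then retry */ }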
In the following sections, you will integrate the Transient Fault Handling Application Block into the
WineCloudWebApi project. Specifically, you'll modify the Wine and Customer Web APIs to recover
gracefully from transient fault conditions.
Follow these steps to add a reference to the Transient Fault Handling Application Block:
1. Right-click the WineCloudWebApi project in the Solution Explorer, and choose Manage NuGet
Packages to display the Manage NuGet Packages dialog.
2. Choose Online on the left, type transient fault sql database in the search box in the upper
right corner, and press Enter.
3. Select Enterprise Library - Transient Fault Handling Application Block - Microsoft Azure SQL
Database Integration from the search results, and click Install as shown in Figure 8-10.
FIGURE 8-10 Installing the Transient Fault Handling Application Block using the Manage NuGet
Packages dialog
4. In the License Acceptance dialog, click I Accept as shown in Figure 8-11. This downloads and
adds a reference to Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling.Data.dll and its
dependencies.
FIGURE 8-11 Accepting the license for the Transient Fault Handling Application Block
You have now added a reference to the Transient Fault Handling Application Block using NuGet
and are now ready to get started integrating it into the WineCloudWebApi project. In the next two
sections, you will modify the Wine and Customer Web APIs to recover from transient errors.
TABLE 8-1 Major components of the Transient Fault Handling Application Block
Component Description
Detection strategy Determines whether an error is a transient fault that can be retried
Retry strategy Defines how often and how many times to retry when a fault is identified as a
transient fault
Retry policy Combines a detection strategy and a retry strategy, and is used to call services that
might encounter transient faults
Detection strategies are used by the Transient Fault Handling Application Block to determine if
an error is transient and whether the failed service call should be retried. Detection strategies are
classes that implement the ITransientErrorDetectionStrategy interface, and the Transient Fault
Handling Application Block has implementations for SQL Database, Service Bus, Storage, and Caching.
The procedures in this chapter use the SqlDatabaseTransientErrorDetectionStrategy class to detect SQL
Database transient error conditions, but you can implement your own detection strategy using the
ITransientErrorDetectionStrategy interface.
Retry strategies are used to define the frequency and number of times the Transient Fault Handling
Application Block will retry and attempt to automatically recover from a transient error. Retry
strategies are classes that derive from the RetryStrategy class. Table 8-2 shows the retry strategies that are
implemented in the Transient Fault Handling Application Block.
TABLE 8-2 Retry strategies in the Transient Fault Handling Application Block
Class Description
ExponentialBackoff Retries a specified number of times, exponentially delaying retries based on specified
back-off parameters
FixedInterval Retries a specified number of times with a fixed interval between each retry
Incremental Retries a specified number of times with an incrementing interval between each retry
Using a detection strategy and a retry strategy, you create a retry policy using the RetryPolicy
class. The retry policy will inspect errors that occur using the detection strategy and execute the retry
strategy if an error is identified as a transient error. The Transient Fault Handling Application Block
provides multiple ways for you to integrate it into your applications. You can use new classes that
encapsulate existing classes to make them transient-fault and retry aware. ReliableSqlConnection is
a class that encapsulates SqlConnection. You can use it in place of SqlConnection to automatically
handle retrying when transient faults occur. There are also extension methods for existing classes,
including the SqlConnection and SqlCommand classes. These extension methods can be used to apply
retry policies, as shown in the following procedures (and in the sketch that follows them).
To handle transient faults in the GetCustomers method, follow these steps:
1. Expand the Controllers folder beneath the WineCloudWebApi project in Solution Explorer.
2. Double-click CustomerController.cs, and add the following using statement to the list of using
statements at the top:
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;
3. Locate the GetCustomers method, and type the following two lines of code just below the
assignment of the connectionString variable:
var retryStrategy =
new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
var retryPolicy =
new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);
1. Locate the GetCustomer method, and type the following two lines of code just below the
assignment of the connectionString variable:
var retryStrategy =
new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
var retryPolicy =
new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);
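The retry policy by itself does not change how the ADO.NET calls behave; the data-access calls also need to go through the policy. One way to do that is with the extension methods the application block provides for SqlConnection and SqlCommand (supplied by the SQL Database integration package you installed earlier). The following is a minimal sketch, assuming the retryPolicy variable created above, of how the data-access code in GetCustomer might change; method names such as OpenWithRetry and ExecuteReaderWithRetry come from that library, not from this book's listings:

using (var connection = new SqlConnection(connectionString))
{
    var commandText = "SELECT * FROM Customer WHERE CustomerId = @CustomerId";
    using (var command = new SqlCommand(commandText, connection))
    {
        command.Parameters.AddWithValue("@CustomerId", id);

        // Open the connection and execute the reader through the retry policy,
        // so transient failures are retried automatically.
        connection.OpenWithRetry(retryPolicy);
        using (var reader = command.ExecuteReaderWithRetry(retryPolicy))
        {
            // ... read rows as before ...
        }
    }
}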
You are now handling transient fault conditions that might occur when using SQL Database by
applying the Incremental retry strategy in the Customer Web API. The Incremental retry strategy takes
three parameters.
The first parameter defines the number of times to retry. In the preceding procedure, you
configured the Incremental retry strategy to retry five times. The second and third parameters
together define the amount of time between each retry attempt. The second parameter defines the
time to wait before the first retry, and the third parameter defines the amount of time to add to each
subsequent retry attempt. In the preceding procedure, you set the second parameter to one second
and the third parameter to two seconds. This will retry after one second, three seconds, five seconds,
seven seconds, and nine seconds.
By defining a retry policy using the Transient Fault Handling Application Block, and by using the
extension methods it provides on the SqlConnection and SqlCommand classes, the Customer Web API
is now resilient to transient error conditions.
In the following procedure, you will use the same RetryPolicy class you used earlier with the
extension methods on the SqlConnection and SqlCommand classes. The RetryPolicy class has an
ExecuteAction method that is designed to execute code that encapsulates ADO.NET. This is useful
when working with object relational mapping (ORM) and data-access platforms such as Entity
Framework. Listing 8-6 shows a modified WineController class with Entity Framework calls that result
in queries to SQL Database surrounded by the ExecuteAction method. The manual changes you
need to apply to the code generated automatically by the scaffolding are indicated in bold type. Any
exceptions that bubble up to the RetryPolicy will be inspected for transient error conditions and
retried appropriately.
LISTING 8-6 The WineController.cs class using the Transient Fault Handling Application Block to handle transient
error conditions
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;
using System;
using System.Collections.Generic;
using System.Data;
using System.Data.Entity;
using System.Data.Entity.Infrastructure;
using System.Linq;
using System.Net;
using System.Net.Http;
using System.Web.Http;
using System.Web.Http.Description;
using WineCloudWebApi.Models;
namespace WineCloudWebApi.Controllers
{
public class WineController : ApiController
{
private WineDbContext db = new WineDbContext();
private RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy> _retryPolicy;
public WineController()
{
var retryStrategy =
new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
_retryPolicy =
new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);
}
// GET api/Wine
public IQueryable<Wine> GetWines()
{
IQueryable<Wine> wines = null;
_retryPolicy.ExecuteAction(() =>
{
wines = db.Wines;
});
return wines;
}
// GET api/Wine/5
[ResponseType(typeof(Wine))]
public IHttpActionResult GetWine(int id)
{
Wine wine = null;
_retryPolicy.ExecuteAction(() =>
{
wine = db.Wines.Find(id);
});
if (wine == null)
{
return NotFound();
}
return Ok(wine);
}
// PUT api/Wine/5
public IHttpActionResult PutWine(int id, Wine wine)
{
if (!ModelState.IsValid)
{
return BadRequest(ModelState);
}
if (id != wine.WineId)
{
return BadRequest();
}
db.Entry(wine).State = EntityState.Modified;
try
{
// The catch block below follows the standard Web API scaffolding; the
// original listing is abridged at this point.
_retryPolicy.ExecuteAction(() =>
{
db.SaveChanges();
});
}
catch (DbUpdateConcurrencyException)
{
if (!WineExists(id))
{
return NotFound();
}
throw;
}
return StatusCode(HttpStatusCode.NoContent);
}
// POST api/Wine
[ResponseType(typeof(Wine))]
public IHttpActionResult PostWine(Wine wine)
{
if (!ModelState.IsValid)
{
return BadRequest(ModelState);
}
db.Wines.Add(wine);
_retryPolicy.ExecuteAction(() =>
{
db.SaveChanges();
});
// Standard scaffolded response (the original listing is abridged here).
return CreatedAtRoute("DefaultApi", new { id = wine.WineId }, wine);
}
// DELETE api/Wine/5
[ResponseType(typeof(Wine))]
public IHttpActionResult DeleteWine(int id)
{
Wine wine = null;
_retryPolicy.ExecuteAction(() =>
{
wine = db.Wines.Find(id);
});
if (wine == null)
{
return NotFound();
}
db.Wines.Remove(wine);
_retryPolicy.ExecuteAction(() =>
{
db.SaveChanges();
});
return Ok(wine);
}
// The start of this helper method is abridged in the original listing; it is
// reconstructed here following the retry-policy pattern used in this controller.
private bool WineExists(int id)
{
var wineExists = false;
_retryPolicy.ExecuteAction(() =>
{
wineExists = db.Wines.Count(e => e.WineId == id) > 0;
});
return wineExists;
}
}
}
To handle transient faults in the Wine controller class, follow these steps:
1. Expand the Controllers folder beneath the WineCloudWebApi project in Solution Explorer.
3. Update the code as indicated by the bolded sections of Listing 8-6. Specifically:
d. Wrap each line of code that invokes a query or calls SaveChanges inside the retry policy’s
ExecuteAction method.
You have now handled transient error conditions using both ADO.NET and Entity Framework. The
Transient Fault Handling Application Block is a huge help in doing this in a simple, quick, and
succinct way. Without it, you would have to write the code to identify when an error is transient by
comparing error numbers from SQL exceptions to a list of known transient errors.
Transient error conditions are not something you should ignore. If your application does not
account for these conditions, it could experience dropped connections and throttling,
resulting in errors and bad experiences for your users that could be avoided simply by retrying and
continuing on. This is a common area of frustration and source of reliability issues for new users of SQL
Database, and it can be avoided at the outset by integrating the Transient Fault Handling Application
Block into your applications from the start.
These are some things you can do to reduce network round trips to SQL Database:
■ Encapsulate complex data access in stored procedures if the data access results in multiple
round trips and queries to SQL Database. For example, if a query depends on results from
a previous query, instead of making multiple round trips to the database from the client,
combining those queries in a single stored procedure will reduce the network latency (see the
sketch after this list).
■ Use client-side storage and caching to reduce network traffic when retrieving lookup
data and data that changes infrequently. In addition to client-side storage, you can also use
distributed caching services like Microsoft Azure Cache to keep data in memory and share the
cached data across multiple nodes.
■ Avoid retrieving metadata at runtime to reduce round trips. This includes avoiding classes like
SqlCommandBuilder, which query metadata at runtime.
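As a sketch of the first point, the following shows a single stored procedure call replacing what would otherwise be two dependent queries issued from the client. The procedure name GetCustomerFavoriteWine is hypothetical, and the example assumes the usual System.Data and System.Data.SqlClient namespaces plus a connectionString and customerId variable:

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand("GetCustomerFavoriteWine", connection))
{
    // Execute one stored procedure on the server instead of issuing the
    // dependent queries individually from the client.
    command.CommandType = CommandType.StoredProcedure;
    command.Parameters.AddWithValue("@CustomerId", customerId);

    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // ... map the result rows ...
        }
    }
}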
When you consider performance, scalability, and cost, relational databases are usually not the best
places to store unstructured, nonrelational, and binary data. (The FILESTREAM feature in SQL Server
represents an exception to this rule of thumb, but FILESTREAM, unfortunately, is not supported in SQL
Database.) One of the ways you can increase the performance of SQL Database is to use it to store
and serve relational data and only relational data, and use one of the alternative storage services to
manage unstructured or semistructured nonrelational data.
Blob storage is a massively scalable file server service in Microsoft Azure. It is intended to store
binary files, including documents and media. If you are storing binary large objects (BLOBs) in a
database using the varbinary(max) data type (again, FILESTREAM is not supported), Blob storage is a good
alternative solution for storing that data. Using Blob storage will reduce the load on SQL Database
and will also reduce the amount of data stored in SQL Database. In addition to having performance
benefits, Blob storage is significantly cheaper than SQL Database.
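For instance, with the WindowsAzure.Storage client library (added via NuGet), an image that might otherwise be stored in a varbinary(max) column can be uploaded to Blob storage along these lines. The storage connection string, container name, blob name, and file path are placeholders, so treat this as a sketch rather than production code:

// Placeholders: supply your own storage account connection string.
var account = CloudStorageAccount.Parse(
    "DefaultEndpointsProtocol=https;AccountName=<AccountName>;AccountKey=<AccountKey>");
var blobClient = account.CreateCloudBlobClient();

// Create the container if needed, then upload the file as a block blob.
var container = blobClient.GetContainerReference("wine-labels");
container.CreateIfNotExists();

var blob = container.GetBlockBlobReference("chateau-123-label.png");
using (var stream = File.OpenRead(@"C:\temp\chateau-123-label.png"))
{
    blob.UploadFromStream(stream);
}

// Store only the blob URI in SQL Database instead of the binary content.
var blobUri = blob.Uri.ToString();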
Queue storage is a service designed for queuing messages for asynchronous processing. Queuing
work for asynchronous processing is often done using a database table with status columns. You can
reduce load and contention on SQL Database by using Queue storage for these scenarios instead.
When data is accessed frequently and changed infrequently, that data is a great candidate for
caching. Microsoft Azure provides a Cache service that can be deployed to your own compute
instances in Microsoft Azure, or you can consume a managed Cache service. In simple cases, storing
and accessing data from the local file system might meet your needs.
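For lookup data that changes infrequently, even an in-process cache on the web tier can eliminate repeated queries. The following is a minimal sketch using System.Runtime.Caching; the cache key, the ten-minute expiration, and the LoadWinesFromDatabase helper are illustrative assumptions, not part of the WineCloud sample:

var cache = MemoryCache.Default;
var wines = cache.Get("AllWines") as IList<Wine>;
if (wines == null)
{
    // Cache miss: load from SQL Database once, then keep the results in memory.
    wines = LoadWinesFromDatabase();   // hypothetical data-access helper
    cache.Set("AllWines", wines, new CacheItemPolicy
    {
        AbsoluteExpiration = DateTimeOffset.UtcNow.AddMinutes(10)
    });
}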
Optimizing queries
Many performance-tuning techniques and principles for SQL Database are the same as SQL Server in
your own data center. In SQL Database, queries must be optimized and tuned just like in SQL Server
on-premises. Query tuning is a very big topic and is outside the scope of this book; however, there is
already a lot of good content written on this topic in other books. The basic principles of analyzing
execution plans, optimizing queries to reduce disk I/O and wait time, and tuning indexes are the same
as in SQL Server. However, identifying slow queries and queries that are good candidates for optimi-
zation is a little different than it is in SQL Server. SQL Database does not contain SQL Server function-
ality that requires elevated permissions or has the potential to impact the performance and reliability
of other tenants; therefore, SQL Proiler cannot be used with SQL Database. Knowing how to iden-
tify queries that are slow and can be improved in SQL Database is very valuable and is discussed in
Chapter 9, “Monitoring and managing SQL Database.”
TABLE 8-3 Service levels of SQL Database Premium
Service level Description
P1 1 CPU core, 8 GB of memory, and 150 disk input/output operations per second (IOPS)
P2 Equivalent to two P1 reservations
Note Microsoft regularly releases new features for Microsoft Azure. These features often
begin as Preview services, which don’t come with service level agreements (SLAs) or
warranties, and they are typically not supported by Microsoft customer support. While in
Preview, services often have reduced pricing (and sometimes are free). When a service
progresses from Preview to General Availability (GA), the service gets service level agreements,
warranties, support, and full pricing. For more information about Microsoft Azure preview
services, visit http://azure.microsoft.com/en-us/services/preview/.
To sign up for the SQL Database Premium Preview, follow these steps:
2. Click the Preview Features button as shown in Figure 8-12 to display the Microsoft Azure
preview features.
FIGURE 8-12 The Preview Features link in the Microsoft Azure account portal
3. Click the Try It Now button to the right of Premium For SQL Database, as shown in
Figure 8-13, to activate the SQL Database Premium preview.
4. Choose the subscription you want to use with SQL Database Premium in the Add Preview
Feature dialog, as shown in Figure 8-14, and click the check mark in the bottom right corner.
FIGURE 8-14 Selecting the subscription in the Add Preview Feature dialog
You have now requested to be signed up for the SQL Database Premium preview. Requests are
queued and approved based on current capacity and demand. You will receive an email informing
you when your request has been approved and SQL Database Premium preview has been activated
in your subscription. Once the preview has been activated, you can then request a SQL Database
Premium quota for your SQL Database servers.
To request a SQL Database Premium quota, follow these steps:
3. Click the SERVERS link at the top of the page. This displays a list of your Microsoft Azure SQL
Database servers.
4. In the NAME column, click the server that you want to request a Premium database quota for.
5. Navigate to the server home screen by clicking on the cloud with the lightning bolt icon.
6. Click Request Premium Database Quota in the Premium Database section as shown in
Figure 8-15.
FIGURE 8-15 Requesting a SQL Database Premium quota in the Microsoft Azure management portal
You have now requested to have a SQL Database Premium quota added to your SQL Database
server. Requests are queued and approved based on current capacity and demand. The status of a
Premium Database Quota request is displayed in the Premium Database section of the server home
screen where you initiated the request. After you initiate the request and prior to it being approved,
a Pending Approval message will be displayed, as shown in Figure 8-16. You can cancel the request
by clicking the Cancel Request link. When your request is approved and you have a SQL Database
Premium quota on your server, you will also receive an email notification. Once you have a SQL
Database Premium quota, you can create a new database or associate an existing database with your
SQL Database Premium quota.
2. In the NAME column, click the database you want to upgrade to SQL Database Premium. This
database must be in your SQL Database server with an approved SQL Database Premium
quota.
3. Click the SCALE link at the top of the page, and configure SQL Database Premium:
a. For Edition, select Premium. This upgrades your database from the Web and Business
editions to the SQL Database Premium edition.
b. For Reservation Size, you can select either P1 or P2. P1 is 1 core, 8 GB of memory and 150
IOPS. P2 is equivalent to two P1s. The page should appear similar to Figure 8-17.
FIGURE 8-17 Selecting the SQL Database Premium reservation size
You have now scaled up your SQL Database using the preview of SQL Database Premium. With
SQL Database Premium, you reserve a portion of the SQL Database server compute resources just
for your database. This resource reservation provides consistent and predictable performance
characteristics for a SQL Database.
Partitioning data
Scaling relational databases is not a trivial task. Stateless application and web servers are typically
simple to scale. By deploying a new server and adding it to a load balancer, you instantly add capac-
ity and scale your application. But the relational characteristics and the ACID (Atomic, Consistent,
Isolated, and Durable) properties of relational databases make it difficult to distribute a database
across multiple compute nodes. If you need to scale a relational database across multiple compute
nodes, you must partition the data. The following sections will describe approaches and techniques
for partitioning databases.
To partition your data functionally, you need to remove database-enforced relationships and
foreign keys between tables that will be in different databases. You can then split your tables into
multiple databases, but without the database-level constraints, your applications will need to enforce
the relationships between the tables that span databases. Any queries that join data across the tables
that have been split into multiple databases need to be split, and the database-connection strings
need to be updated to direct the application to the appropriate database. Splitting databases into
functional partitions can get complex and time consuming if there are lots of relationships between
tables, and lots of queries and applications that need to be updated.
One of the goals when partitioning data is to balance and equally distribute load across multiple
compute nodes. Although partitioning data by function can help split up the load across multiple
compute nodes, it is unlikely that your load will be distributed evenly across functional partitions.
In the next section, you'll explore another data-partitioning technique called sharding. Sharding is a
technique that makes it easier to achieve an even load distribution.
In our WineCloudDb example, you will partition by a range of CustomerId values. The customer
with a CustomerId of 1 will go into one database shard, and the customers with CustomerId values ranging
from 2 to 3 will go into another database shard. When sharding databases, you also need to have an
index or map of how your data is partitioned so that, first, you can find your records and, second, you
have a way to rearrange your records within the database shards if you find that your distribution is
not equal and needs to be rebalanced. In the following example, the shard map is implemented in
code for simplicity and readability. Typically, though, you implement the shard map in a persistent
data store separate from your code.
Similar to functional partitions, it can be challenging to shard data when you have relationship
constraints and foreign keys. Often, reference and lookup data will get duplicated and stored in each
shard. (The Microsoft Azure SQL Data Sync service can help you maintain multiple copies of reference
data tables across multiple shards; see Chapter 7, “Microsoft Azure SQL Data Sync,” for more informa-
tion.) In the WineCloudDb example, you will store the Wine table in each shard. If the customer table
has relationships to other customer-related tables, you typically partition those tables and store those
related records in the same database shard as the customer record. In the WineCloudDb example, the
Customer table doesn’t have additional customer-related tables, which makes sharding this database
a lot simpler.
1. From the Windows Start screen, launch SSMS. You can either scroll through the app tiles to
find it (in the Microsoft SQL Server 2012 category) or just type sql server management
studio to run a search, and then click on the tile. After a brief moment, the Connect To Server
dialog appears.
c. For Login and Password, type the user name and password you assigned the server when
you created it.
a. In the Object Explorer, right-click the server name and choose New Query to open a new
query window connected to the master database.
c. Press F5 (or click the Execute button in the toolbar) to create the two databases.
4. In the Object Explorer, right-click the Databases node and choose Refresh. The
WineCloudDbShard1 and WineCloudDbShard2 databases you just created should now appear.
5. Right-click the WineCloudDbShard1 database, and choose New Query to open a new query
window connected to the WineCloudDbShard1 database.
6. Type the code shown in Listing 8-7 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
7. Press F5 (or click the Execute button in the toolbar) to create the database objects for the first
shard, and then close the query window without saving the script.
8. Right-click the WineCloudDbShard1 database, and choose New Query to open a new query
window connected to the WineCloudDbShard1 database.
10. Press F5 (or click the Execute button in the toolbar) to add a customer row for Jeff Hay to the
WineCloudDbShard1 database, and then close the query window without saving the script.
11. Right-click the WineCloudDbShard2 database, and choose New Query to open a new query
window connected to the WineCloudDbShard2 database.
12. Type the code shown in Listing 8-7 into the query window (or paste it from the listing file
downloaded from the book’s companion website).
13. Press F5 (or click the Execute button in the toolbar) to create the database objects for the
second shard, and then close the query window without saving the script.
14. Right-click the WineCloudDbShard2 database, and choose New Query to open a new query
window connected to the WineCloudDbShard2 database.
16. Press F5 (or click the Execute button in the toolbar) to add two customer rows for Mark
Hanson and Jeff Phillips to the WineCloudDbShard2 database, and then close the query
window without saving the script.
You now have two new databases named WineCloudDbShard1 and WineCloudDbShard2 that will
serve as the data source for your customer Web APIs. Each database has the same reference data in
the Wine table, but the Customer table is partitioned horizontally between them.
Listing 8-8 shows the classes used for working with the individual shard databases. The Shard
class defines a shard database with the range of IDs that it will contain using the BeginId and the
EndId properties. The Shard class also has a ConnectionString property that has a connection string
for the database shard. The ShardRoot class represents the logical root that tracks the multiple shard
databases that collectively form the single logical database. The CustomerShard class is the logical
root implementation for the WineCloudDb database shards. It uses the Shard and ShardRoot classes
to collect the multiple customer databases into one logical container. The GetShard method on the
CustomerShard class makes it easy to retrieve the database shard for a specified customer.
using System.Collections.Generic;
using System.Linq;
namespace WineCloudWebApi.Data
{
public class Shard
{
public int Id { get; set; }
public int BeginId { get; set; }
public int EndId { get; set; }
public string ConnectionString { get; set; }
}
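// NOTE: The ShardRoot class and the opening of the CustomerShard class are
// abridged in this excerpt. The declarations below are a reconstruction based on
// how they are described and used in this chapter (a Shards collection, a static
// Instance, and a ShardRoot property); the original listing may differ in detail.
public class ShardRoot
{
public ShardRoot()
{
Shards = new List<Shard>();
}
public IList<Shard> Shards { get; private set; }
}
public class CustomerShard
{
public static readonly CustomerShard Instance = new CustomerShard();
public ShardRoot ShardRoot { get; private set; }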
public CustomerShard()
{
ShardRoot = new ShardRoot();
ShardRoot.Shards.Add(new Shard
{
Id = 1,
BeginId = 1,
EndId = 1,
ConnectionString =
"Server=tcp:<ServerName>.database.windows.net,1433;" +
"Database=WineCloudDbShard1;User ID=<UserName>@<ServerName>;" +
"Password=<Password>;Trusted_Connection=False;" +
"Encrypt=True;Connection Timeout=30;"
});
ShardRoot.Shards.Add(new Shard
{
Id = 2,
BeginId = 2,
EndId = 3,
ConnectionString =
"Server=tcp:<ServerName>.database.windows.net,1433;" +
"Database=WineCloudDbShard2;User ID=<UserName>@<ServerName>;" +
"Password=<Password>;Trusted_Connection=False;" +
"Encrypt=True;Connection Timeout=30;"
});
}
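// Reconstructed sketch (abridged in this excerpt): return the shard whose
// BeginId/EndId range contains the specified customer ID.
public Shard GetShard(int customerId)
{
return ShardRoot.Shards.FirstOrDefault(
s => customerId >= s.BeginId && customerId <= s.EndId);
}
}
}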
The CustomerController shown in Listing 8-9 has been modified to retrieve customers from the
multiple databases using the CustomerShard class. The changes to the GetCustomer method are
simple. Using the ID passed into the GetCustomer method, the database where that customer exists
is returned by the GetShard method on the CustomerShard class. Then, using the same logic as in the
previous CustomerController examples in this chapter, the customer table is queried and the Customer
is returned. The changes required to get a single record as needed for the GetCustomer method are
minimal.
However, the changes required for the GetCustomers method are a little more complex. Whenever
a database is sharded into multiple databases, querying data that spans the multiple databases can be
challenging. To query across the databases, you must query each database and merge the results. This
approach is commonly referred to as fan-out querying. If you have many databases, you do not want
to execute those queries in a series, waiting for the previous query to return, because that increases
response time and provides a slower experience for users. Instead, you want to execute those queries
simultaneously and aggregate the results as they are returned in parallel. The GetCustomers method
in Listing 8-9 shows a simple fan-out query implementation using the Task Parallel Library. In this
example, each database shard is queried for customers in parallel and the results from each query are
merged and returned as one list of customers.
using System;
using System.Collections.Generic;
using System.Data.SqlClient;
using System.Threading.Tasks;
using System.Web.Http;
using System.Web.Http.Description;
using Microsoft.Practices.EnterpriseLibrary.TransientFaultHandling;
using WineCloudWebApi.Data;
using WineCloudWebApi.Models;
namespace WineCloudWebApi.Controllers
{
public class CustomerController : ApiController
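{
// GET api/Customer
// NOTE: The opening of this method is abridged in this excerpt; the signature
// and result list below are reconstructed from the description in the text.
public IList<Customer> GetCustomers()
{
var customers = new List<Customer>();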
Parallel.ForEach(CustomerShard.Instance.ShardRoot.Shards,
new ParallelOptions
{ MaxDegreeOfParallelism = CustomerShard.Instance.ShardRoot.Shards.Count },
shard =>
{
var shardCustomers = new List<Customer>();
var connectionString = shard.ConnectionString;
var retryStrategy =
new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
var retryPolicy =
new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);
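// Reconstructed sketch (abridged in this excerpt): query this shard through the
// retry policy, then merge its rows into the combined result under a lock.
using (var connection = new SqlConnection(connectionString))
{
var commandText = "SELECT * FROM Customer";
using (var command = new SqlCommand(commandText, connection))
{
connection.OpenWithRetry(retryPolicy);
using (var reader = command.ExecuteReaderWithRetry(retryPolicy))
{
while (reader.Read())
{
shardCustomers.Add(new Customer
{
CustomerId = Convert.ToInt32(reader["CustomerId"]),
FirstName = reader["FirstName"].ToString(),
LastName = reader["LastName"].ToString(),
FavoriteWineId = reader["FavoriteWineId"] as int?
});
}
}
}
}
lock (customers)
{
customers.AddRange(shardCustomers);
}
});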
return customers;
}
// GET api/Customer/5
[ResponseType(typeof(Customer))]
public IHttpActionResult GetCustomer(int id)
{
Customer customer = null;
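// Reconstructed sketch (abridged in this excerpt): look up the shard that
// holds this customer ID, then query only that shard.
var customerShard = CustomerShard.Instance.GetShard(id);
if (customerShard != null)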
{
var connectionString = customerShard.ConnectionString;
var retryStrategy =
new Incremental(5, TimeSpan.FromSeconds(1), TimeSpan.FromSeconds(2));
var retryPolicy =
new RetryPolicy<SqlDatabaseTransientErrorDetectionStrategy>(retryStrategy);
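// Reconstructed sketch (abridged in this excerpt): query the shard for the
// single customer row, using the retry-policy extension methods.
using (var connection = new SqlConnection(connectionString))
{
var commandText = "SELECT * FROM Customer WHERE CustomerId = @CustomerId";
using (var command = new SqlCommand(commandText, connection))
{
command.Parameters.AddWithValue("@CustomerId", id);
connection.OpenWithRetry(retryPolicy);
using (var reader = command.ExecuteReaderWithRetry(retryPolicy))
{
if (reader.Read())
{
customer = new Customer
{
CustomerId = Convert.ToInt32(reader["CustomerId"]),
FirstName = reader["FirstName"].ToString(),
LastName = reader["LastName"].ToString(),
FavoriteWineId = reader["FavoriteWineId"] as int?
};
}
}
}
}
}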
if (customer == null)
{
return NotFound();
}
return Ok(customer);
}
}
}
To use the database shards with the customer Web API, follow these steps:
2. Right-click the WineCloudWebApi project in Solution Explorer, and choose Add | New Folder.
4. Right-click on the newly created Data folder, and choose Add | Class to display the Add New
Item dialog.
a. Replace <ServerName> with the name of the SQL Database server that contains the shard
databases.
b. Replace <UserName> and <Password> with the user name and password you assigned
the server when you created it.
7. Expand the Controllers folder beneath the WineCloudWebApi project in Solution Explorer.
8. Double-click the CustomerController.cs, and replace the code with the code shown earlier in
Listing 8-9.
9. Build the WineSolution by selecting Build | Build Solution in the menu at the top or pressing
Ctrl+Shift+B.
You have now split the WineCloudDb database into multiple database shards, splitting on customer
records. You can test the customer API using the same steps from the “Testing the Customer Web
API” section.
Listing 8-8 and Listing 8-9 show a simple implementation of integrating database sharding into
your application. In Listing 8-8, there are classes to manage and group the sharded databases; in
Listing 8-9, the CustomerController has a simple implementation of fan-out querying using the Task
Parallel Library. This implementation works for scenarios that are small and not overly complex, but
in larger and more complex scenarios you will likely need additional capabilities to track database
shards and update data across shards. There are existing libraries you can use to help with these
additional capabilities. The Microsoft Azure Customer Advisory Team creates guidance, frameworks,
and reference applications based on what they have learned from real-world customer engagements.
Microsoft Azure CAT has developed a reference application called Cloud Service Fundamentals in
Windows Azure that includes a library named Microsoft.AzureCat.Patterns.Data.SqlAzureDalSharded
that contains classes that will help you when sharding relational databases. You can reuse this library
in your applications, and it is available at http://code.msdn.microsoft.com/windowsazure/
Cloud-Service-Fundamentals-4ca72649.
Note EF can also be used when sharding databases. Using the same approach shown in
Listing 8-9, you can use the Task Parallel Library to execute queries across multiple data-
bases in parallel. However, instead of using raw ADO.NET objects to access your database,
you use EF objects. You construct an EF DbContext for each database, instead of opening
a new SqlConnection. Using the DbContext object, you then query the appropriate DbSet
property using LINQ To Entities. Sharding with EF is possible, but if you are trying to boost
performance, the performance overhead that comes with using EF may encourage you to
use raw ADO.NET instead.
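A minimal sketch of that approach follows. The ShardDbContext class name and its connection-string constructor are hypothetical (DbContext provides such a constructor that a derived context can pass through), and the example assumes the Task Parallel Library and the CustomerShard class from Listing 8-8:

var customers = new List<Customer>();

Parallel.ForEach(CustomerShard.Instance.ShardRoot.Shards, shard =>
{
    // One DbContext per shard, pointed at that shard's connection string.
    using (var db = new ShardDbContext(shard.ConnectionString))
    {
        var shardCustomers = db.Customers.ToList();   // LINQ to Entities query, per shard
        lock (customers)
        {
            customers.AddRange(shardCustomers);
        }
    }
});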
Summary
This chapter introduced you to optimizing performance and scaling Microsoft Azure SQL Database.
You created an ASP.NET Web API that was used as a reference application throughout the chapter,
and you improved the reliability and performance of this Web API by managing database connections
and connection errors and reducing latency in Microsoft Azure. You then considered other optimiza-
tions, like using the most appropriate storage service for your data and query optimization. Later in
the chapter, you scaled up SQL Database using SQL Database Premium. And in the last section, you
scaled your SQL Database with partitioning and sharding strategies.
Designing highly available, high-performance, scalable systems is a very large topic. This chapter
provided an understanding of the most important concepts, principles, and techniques for achieving
high performance and scale with SQL Database, but there is a lot more for you to learn on your own.
Any service you intend to use in a production application must provide monitoring and
management capabilities. Monitoring should provide insight into the health of the service and,
ultimately, the health of your application. So, in the first half of this chapter, you will learn how to
monitor SQL Database using the management portal, the Service Dashboard, and built-in dynamic
management views and functions.
In previous chapters, you managed SQL Database using graphical user interface (GUI) tools such
as the Microsoft Azure management portal, SQL Database management portal, SQL Server Manage-
ment Studio (SSMS), and SQL Server Data Tools (SSDT). These GUI tools are convenient when you are
getting started, but as your applications mature and you move toward production, you identify pro-
cesses that are frequently repeated and GUIs become inconvenient. To save time and reduce human
error, you need to automate these processes. Thus, the second half of this chapter teaches you how
to manage SQL Database using the Microsoft Azure Service Management Application Programming
Interface (API).
For the exercises in this chapter, use the script shown in Listing 9-1 to create WineCloudDb and
populate it with a few wines and customers.
LISTING 9-1 Script to create the sample WineCloudDb database
1. From the Windows Start screen, launch SSMS. You can either scroll through the app tiles to
find it (in the Microsoft SQL Server 2012 category) or just type sql server management
studio to run a search, and then click on the tile. After a brief moment, the Connect To Server
dialog appears.
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name and password you assigned the server when
you created it.
a. In the Object Explorer, right-click the server name and choose New Query to open a new
query window connected to the master database.
c. Press F5 (or click the Execute button in the toolbar) to create the database.
6. In the Object Explorer, right-click the Databases node and choose Refresh. The WineCloudDb
database you just created should now appear.
7. Right-click the WineCloudDb database, and choose New Query to open a new query window
connected to the WineCloudDb database.
8. Type the code shown in Listing 9-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
9. Press F5 (or click the Execute button in the toolbar) to create the database schema and
populate some data.
You now have a new WineCloudDb database you can use for the exercises in this chapter.
Note It was necessary to create the database and populate it in two separate query
windows because SQL Database does not support the USE statement found in SQL Server
for switching the connection from the master database to the WineCloudDb database. See
Chapter 3, “Differences between SQL Server and Microsoft Azure SQL Database,” for more
information on differences between SQL Database and SQL Server.
Monitoring
SQL Database provides multiple options for monitoring the health and operations of your servers
and databases. In this section, you will learn how to monitor SQL Database using the Microsoft
Azure management portal, Microsoft Azure Service Dashboard, SQL Database management portal,
and dynamic management views and functions. Using a combination of the tools described in this
section, you will be able to get a comprehensive view of the health of your servers and databases in
SQL Database.
3. Click the SERVERS link at the top of the page. This displays a list of your Microsoft Azure SQL
Database servers.
4. In the NAME column, click the server that contains the WineCloudDb database. This opens a
page with links for the selected server.
5. Click the DASHBOARD link at the top of the page. This displays the SQL Database server usage
overview as shown in Figure 9-1.
In addition to showing server usage and quotas, the Microsoft Azure management portal displays
usage and operational metrics for each database. The database dashboard in the management portal
displays the allocated size of a database, the space that is currently used, and the remaining free
space.
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SQL DATABASES in the vertical navigation pane on the left. This displays a list of your
databases.
5. Scroll the page down a bit, and find the “Usage Overview” section as shown in Figure 9-2. This
displays the used and available storage for your database.
By default, deadlocks, failed connections, and successful connections are displayed for the past
hour. You can change the reporting period from one hour to 24 hours, 7 days, or 14 days. You can
also toggle the chart between showing relative values, which display the actual values relative to each
other, or absolute values, which display the actual values relative to zero as displayed on the Y axis.
Both can be configured using the drop-down lists above the chart on the right side. By clicking the
check mark to the left of each named metric above the chart, you can also show and hide the metrics
plotted on the chart. When you click the refresh button in the upper-right corner above the chart, the
metrics displayed on the chart are updated.
The monitor page displays the details for each metric and allows you to customize which metrics
are displayed in the details list and on the chart. In addition to the deadlocks, failed connections,
and successful connections (which are displayed by default), you can choose to display metrics for
connections that were blocked by the firewall, current database size, and throttled connections.
More Info For more information on the SQL Database firewall, see Chapter 2,
“Configuration and pricing,” and Chapter 5, “Security and backup.” For more information
on connection management and throttled connections, see Chapter 8, “Designing and
tuning for scalability and high performance.”
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SQL DATABASES in the vertical navigation pane on the left. This displays a list of your
databases.
4. Click the MONITOR link at the top of the page. This displays the metrics chart at the top and
the list of metrics with their minimum, maximum, average, and total values in the list below, as
shown in Figure 9-4.
FIGURE 9-4 SQL Database Monitor page, with the chart and details for the configured metrics
5. Click the ADD METRICS button at the bottom of the page to display the CHOOSE METRICS
dialog.
6. Select the Blocked By Firewall check box at the top of the list as shown in Figure 9-5.
7. Click the check mark in the lower-right corner to close the dialog.
You have now added the Blocked By Firewall metric to the list of metrics displayed at the bottom
of the Monitor page. By default, the newly added metrics are not added to the chart at the top. To
display the metric on the chart, click the gray circle to the left of the metric name in the list of metrics.
After you click the gray circle icon, the icon will be changed to a colored circle with a check mark and
the metric will be displayed in the chart, as shown in Figure 9-6.
FIGURE 9-6 A newly added metric in the list of metrics on the Monitor page of a SQL Database
Each Microsoft Azure service is listed on the Service Dashboard page. To the left of each service
name is an icon that represents the current state of each service. If the service has a green check mark
next to it, the service is operating normally. If it has an orange triangle warning icon, the performance
of the service is not normal and is currently running with degraded performance. If a red circle error
icon is displayed, the service is experiencing an outage. Clicking the plus sign to the left of the service
name expands the list of regions where the service is deployed, and you can view the health of the
service in each region. Clicking the RSS icon on the right side of each row displays an RSS feed with
a descriptive status history, as shown in Figure 9-8. You can subscribe to the Service Dashboard RSS
feeds to stay informed of changes in service status.
By default, the Service Dashboard automatically refreshes the service statuses every 10 minutes.
The refresh interval can be set to 1 minute, 2 minutes, 5 minutes, 10 minutes, or Off to disable the
automatic refresh. You can also filter the displayed data center regions to show only the regions that
are relevant to you. When you use Microsoft Azure for managed production workloads, the Service
Dashboard is a website that can be displayed on a monitor in your operations center so that you can
stay informed of any service interruptions in Microsoft Azure.
Monitoring and troubleshooting query performance can be a difficult task. Tools that help you
identify bottlenecks and pinpoint improvement opportunities can help you be more effective when
optimizing queries. The SQL Database management portal provides tools to help you identify
inefficient queries.
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SQL DATABASES in the vertical navigation pane on the left. This displays a list of your
databases.
4. Click the MANAGE button at the bottom of the page to open the SQL Database management
portal.
5. Type the username (for example, saz) and password you assigned when you created the
server, and click Log On. This displays the Summary page for the database, as shown earlier in
Figure 9-9.
FIGURE 9-10 The Query Performance page in the SQL Database management portal
7. Click on one of the queries in the list to display the Query Plan Details. The portal displays
the query, the resources used by the query, and the details of the query’s execution plan, as
shown in Figure 9-11.
Note The query displayed on the Query Plan Details page is read-only. To edit
the query, click the Edit button in the ribbon at the top of the page.
8. Click the Query Plan link below the query to display the execution plan, as shown in
Figure 9-12. Using the buttons on the right, above the graphical query plan, you can toggle
the display of the execution plan display between the graphical, grid, and tree views. The icons
on the left side enable you to highlight the operations of the execution plan based on CPU or
IO cost and the types of operations.
9. Click an operation in the execution plan to view the details and performance cost of the
operation, as shown in Figure 9-13. If you are viewing a large execution plan with many
operations, the graphical execution plan can be zoomed in and out.
You have now used the SQL Database management portal to identify queries in your database with a
high performance cost. Using this portal, you can drill into a query's execution plan, identify
expensive operations, and review the details of a specific operation. The SQL Database management portal
gets the expensive queries displayed in the Query Performance page from SQL Database dynamic
management views, which are explained in the next section.
Name Description
dm_db_index_usage_stats Returns counts of different types of index operations and the time
each type of operation was last performed
dm_db_partition_stats Returns page and row-count information for every partition in the
database
dm_db_wait_stats Returns information about all the waits encountered by threads that
executed during operation
Name Description
dm_db_index_operational_stats Returns current low-level I/O, locking, latching, and access method
activity for each partition of a table or index in the database
dm_db_index_physical_stats Returns size and fragmentation information for the data and indexes
of the specified table or view
dm_db_missing_index_columns Returns information about database table columns that are missing
an index
One database metric that is particularly important to monitor is database size. If the size of your
database reaches the size quota, your queries will return error code 40544. (See Chapter 2 for more
information about the database size quota.) When you reach your size quota, you cannot insert or
update data or create new database objects until you increase the maximum size of your database or
delete data. Using DMVs, you can monitor the size of your database proactively and take the appro-
priate actions before your applications receive errors resulting from a database that has reached its
size quota.
More Info SQL Database raises errors unique to SQL Database, and each error is identified
with an error code or error number. These errors include general errors for features,
objects, and syntax not supported by SQL Database, errors that occur when copying
databases, and connection errors. For more information and a list of the SQL Database
error codes, visit http://msdn.microsoft.com/en-us/library/ff394106.aspx.
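As a sketch of that proactive monitoring, the following ADO.NET snippet sums reserved pages from sys.dm_db_partition_stats to estimate the space used in megabytes (pages are 8 KB each). The connectionString variable and the 900-MB alert threshold are illustrative assumptions; the same SELECT statement can also be run directly in a query window connected to your database:

const string sizeQuery =
    "SELECT SUM(reserved_page_count) * 8.0 / 1024 " +   // 8-KB pages, result in MB
    "FROM sys.dm_db_partition_stats";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(sizeQuery, connection))
{
    connection.Open();
    var usedMegabytes = Convert.ToDouble(command.ExecuteScalar());
    if (usedMegabytes > 900)   // example threshold for a 1-GB database
    {
        // Alert, archive data, or increase the database MAXSIZE before error 40544 occurs.
    }
}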
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SQL DATABASES in the vertical navigation pane on the left. This displays a list of your
databases.
4. Click the MANAGE button at the bottom of the page to open the SQL Database management
portal.
5. Type the user name (for example, saz) and password you assigned when you created the
server, and click Log On. This displays the Summary page for the database.
FIGURE 9-14 Getting database size using DMVs in the SQL Database management portal
Execution
Execution-related DMVs provide insight into connections, sessions, and the requests that your SQL
Database servers receive. The execution-related DMVs that are included in SQL Database are listed in
Table 9-3. Execution-related DMFs are listed in Table 9-4.
Name Description
dm_exec_cached_plans Returns a row for each query plan that is cached by SQL Database
dm_exec_query_memory_grants Returns information about the queries that have acquired a memory
grant or that still require a memory grant to execute
dm_exec_sessions Returns information about all active user connections and internal
tasks
Name Description
dm_exec_describe_first_result_set Returns the metadata description of the first result set for the
statement
dm_exec_describe_first_result_set_for_ Returns the metadata description of the first result set, based on an
object object ID
dm_exec_query_plan Returns the Showplan in XML format for the batch specified by the
plan handle
dm_exec_sql_text Returns the text of the SQL batch that is identified by the specified SQL
handle
dm_exec_text_query_plan Returns the Showplan in text format for the batch specified by the plan
handle or a specific statement within the batch
Database connections are finite resources that get managed both on-premises and in the cloud.
Using execution DMVs, you can view all the active SQL Database connections and see connection
properties such as the login, how much CPU time the session is consuming, when the last request
occurred, and more.
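For example, a query over sys.dm_exec_sessions surfaces those properties. The sketch below runs it from ADO.NET (the connectionString variable is assumed), but the same SELECT can be pasted into a query window in the SQL Database management portal:

const string sessionsQuery =
    "SELECT session_id, login_name, cpu_time, memory_usage, last_request_end_time " +
    "FROM sys.dm_exec_sessions";

using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(sessionsQuery, connection))
{
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // Each row describes one active session: who is connected and how
            // much CPU time (in milliseconds) the session has consumed.
            Console.WriteLine("{0} - {1} - {2} ms",
                reader["session_id"], reader["login_name"], reader["cpu_time"]);
        }
    }
}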
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SQL DATABASES in the vertical navigation pane on the left. This displays a list of your
databases.
4. Click the MANAGE button at the bottom of the page to open the SQL Database management
portal.
5. Type the user name (for example, saz) and password you assigned when you created the
server, and click Log On. This displays the Summary page for the database.
FIGURE 9-15 Viewing SQL Database connections and sessions using execution DMVs in the SQL Database
management portal
Transaction
Transaction-related DMVs provide information about database transactions and locks. The transac-
tion-related DMVs included in SQL Database are listed in Table 9-5. SQL Database does not contain
transaction-related DMFs.
Name Description
Database contention, locking and deadlocks can cause application unresponsiveness, hangs, and
errors. You can use transaction-related DMVs to view current locking activity and blocking in your
SQL Database.
Event tables
Dynamic management views are very helpful tools when monitoring and troubleshooting current
activity, but if you are trying to research issues that occurred in the past, dynamic management views
cannot help with that because they provide information about current activity. To help investigate
issues that occurred in the past and are no longer occurring, SQL Database introduced two catalog
views in the master database, named sys.database_connection_stats and sys.event_log. These event
tables, as they are commonly referred to, collect and store database events that can be used to
troubleshoot past behavior.
Note The Azure PowerShell cmdlets can also be used to effectively automate the
management of SQL Database. See Chapter 2 for detailed information on downloading,
installing, and using the PowerShell cmdlets for Microsoft Azure SQL Database.
The Service Management API lies at the core of all the services in Microsoft Azure. The Service
Management API can be used to create new SQL Database servers and databases, manage the SQL
Database firewall rules, and even reset the administrator password. This API is based on HTTP and
REST, is publicly accessible, and can be used by any device or platform that can issue HTTP requests.
SQL Database APIs can be grouped into three major categories: APIs for managing SQL Database
servers, APIs for managing databases, and APIs for managing server-level firewall rules.
To use the Service Management API, you must authenticate and be authorized to manage the
Microsoft Azure subscription. You can authenticate using an X.509 v3 certificate (referred to as a
management certificate) or using OAuth 2.0 with Microsoft Azure Active Directory. To authenticate
with a management certificate, you must either have a management certificate that is already added
to your Microsoft Azure subscription or create a new X.509 v3 certificate to use as a management
certificate.
Now that you have created a new X.509 v3 certificate to use for your Microsoft Azure management
certificate, you need to upload it to your Microsoft Azure subscription.
1. If you closed the Microsoft Azure management portal since the last procedure, log in to the
Microsoft Azure management portal at https://manage.windowsazure.com. This brings you
to the main portal page showing ALL ITEMS.
2. Click SETTINGS at the bottom of the vertical navigation pane on the left.
3. Click the Management Certificates link at the top of the page to display your management
certificates.
4. Click the Upload button at the bottom of the page to display the Upload A Management
Certificate dialog, as shown in Figure 9-16.
FIGURE 9-16 The Upload A Management Certificate dialog in the Microsoft Azure management portal
5. Click the folder icon, and locate the MicrosoftAzureServiceManagementCertificate.cer file you
created in the previous procedure.
FIGURE 9-17 The list of management certificates in the Microsoft Azure management portal
You have now created an X.509 v3 certificate and uploaded it to your Microsoft Azure
subscription's management certificates. With the certificate in place, you can authenticate to the
Microsoft Azure Service Management API and use the API to manage SQL Database.
In the following procedure, you will create a console application that makes HTTP requests to the
Service Management API to create a database. The console application code shown in Listing 9-2
retrieves your management certificate from your local certificate store by the thumbprint of the
certificate. It then builds an HttpWebRequest for the Service Management REST API, adds the
management certificate to the request, and executes the request asynchronously. The results and
response for the request are written to the console once the request has completed.
LISTING 9-2 The console application code using the Microsoft Azure Service Management REST API
using System;
using System.IO;
using System.Net;
using System.Security.Cryptography.X509Certificates;
using System.Text;
namespace AzureServiceManagementApi
{
    class Program
    {
        // Replace these placeholder values as described in the procedure that follows
        private const string subscriptionId = "<subscription-id>";
        private const string certThumbprint = "<certificate-thumbprint>";
        private const string serverName = "<server-name>";
        private const string databaseName = "WineCloudDb";

        static void Main(string[] args)
        {
            // Retrieve the management certificate from the current user's store by its thumbprint
            var certificateStore = new X509Store(StoreName.My, StoreLocation.CurrentUser);
            certificateStore.Open(OpenFlags.ReadOnly);
            X509Certificate2Collection certs = certificateStore.Certificates.Find
                (X509FindType.FindByThumbprint, certThumbprint, false);
            if (certs.Count == 0)
            {
                Console.WriteLine
                    ("Couldn't find the certificate with thumbprint:" + certThumbprint);
                return;
            }
            certificateStore.Close();

            // Build the REST request that creates a new database on the specified server
            var request = (HttpWebRequest)HttpWebRequest.Create(new Uri(
                "https://management.core.windows.net:8443/" +
                subscriptionId +
                "/services/sqlservers/servers/" + serverName + "/databases"));
            request.Method = "POST";
            request.ClientCertificates.Add(certs[0]);
            request.ContentType = "application/xml";
            request.Headers.Add("x-ms-version", "2012-03-01");

            // Compose the XML body that describes the database to create
            var sb = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?>");
            sb.Append("<ServiceResource xmlns=\"http://schemas.microsoft.com/windowsazure\">");
            sb.AppendFormat("<Name>{0}</Name>", databaseName);
            sb.Append("<Edition>Web</Edition>");
            sb.Append("<MaxSizeGB>1</MaxSizeGB>");
            sb.Append("<CollationName>SQL_Latin1_General_CP1_CI_AS</CollationName>");
            sb.Append("</ServiceResource>");

            var formData = UTF8Encoding.UTF8.GetBytes(sb.ToString());
            request.ContentLength = formData.Length;
            using (var postStream = request.GetRequestStream())
            {
                postStream.Write(formData, 0, formData.Length);
            }

            Console.WriteLine("Creating Database: " + databaseName);

            // Execute the request asynchronously; RespCallback writes the results to the console
            var state = new RequestState();
            state.Request = request;
            request.BeginGetResponse(new AsyncCallback(RespCallback), state);

            // Keep the console open until the asynchronous response has been written
            Console.ReadLine();
        }

        public static string EncodeToBase64String(string original)
        {
            return Convert.ToBase64String(Encoding.UTF8.GetBytes(original));
        }

        private static void RespCallback(IAsyncResult result)
        {
            var state = (RequestState)result.AsyncState;
            var request = state.Request;
            var response = (HttpWebResponse)request.EndGetResponse(result);
            var statusCode = response.StatusCode.ToString();
            var reqId = response.GetResponseHeader("x-ms-request-id");
            Console.WriteLine("Creation Return Value: " + statusCode);
            Console.WriteLine("RequestId: " + reqId);
        }
    }

    public class RequestState
    {
        const int BufferSize = 4096;
        public StringBuilder RequestData;
        public byte[] BufferRead;
        public WebRequest Request;
        public Stream ResponseStream;
        public Decoder StreamDecode = Encoding.UTF8.GetDecoder();

        public RequestState()
        {
            BufferRead = new byte[BufferSize];
            RequestData = new StringBuilder(String.Empty);
            Request = null;
            ResponseStream = null;
        }
    }
}
1. Launch Visual Studio 2013 as an administrator. From the Windows start screen, you can either
scroll through the tiles to find it or just type visual studio 2013 to run an app search. Right-
click on the Visual Studio 2013 tile or result, and click Run As Administrator in the toolbar at
the bottom of the screen. This will launch Visual Studio 2013 as an administrator.
2. Click the FILE menu, and then choose New | Project to display the New Project dialog.
3. On the left of the New Project dialog, expand Templates, Visual C# and choose Console
Application.
4. Name the solution and project AzureServiceManagementApi, and choose any desired
location, as shown in Figure 9-18.
5. In Program.cs, replace the template code with the code shown in Listing 9-2, and then make the following substitutions:
a. Replace <subscription-id> with your Microsoft Azure subscription ID. This can be found under Settings | Subscriptions in the Microsoft Azure management portal.
b. Replace <certificate-thumbprint> with the thumbprint of the management certificate you uploaded in the previous procedure.
c. Replace <server-name> with the name of the server that was assigned when you created your SQL Database server.
6. Press F5 or click Debug | Start Debugging. This opens the console application and creates the
database, as shown in Figure 9-19.
FIGURE 9-19 Creating a database in SQL Database using the Service Management API
You have successfully created a SQL Database using the REST-based Microsoft Azure Service
Management API in a console application. You can verify your database was created using the
Microsoft Azure management portal by finding your new database in the SQL Database server that you used in the previous procedure. In addition to creating databases, you can create, update, delete, and view SQL Database resources, including servers, databases, and firewall rules. For more
information on the available Service Management APIs for SQL Database, visit http://msdn.microsoft.
com/en-us/library/gg715283.aspx.
You then automated the management of your SQL Database using the Service Management APIs.
The Service Management APIs are a collection of REST web APIs that enable you to programmatically
manage your services in Microsoft Azure. These REST APIs are central to Microsoft Azure management and provide the foundation for graphical management tools like the Microsoft Azure
management portal, and even command-line interfaces like the Microsoft Azure PowerShell cmdlets.
As your applications mature and you move toward production, you identify frequently repeated operations that are great candidates for automating using the Service Management API (or PowerShell).
It’s important to automate these processes to save time and reduce human error.
This chapter introduced you to SQL Database monitoring and management capabilities, and it
provided a foundation for you to build upon. Now you can dig deeper into monitoring SQL Database
by creating your own monitoring queries using DMVs and DMFs. You can also further explore Service
Management REST APIs (as well as the PowerShell cmdlets covered in Chapter 2) that enable you to
automate management of SQL Database servers, databases, firewall rules, and more.
At the very beginning of this book, back in Chapter 1, we introduced you to the concept of cloud computing. We began by describing the Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS) acronyms. We also explained that Microsoft Azure SQL Database, in particular, is delivered as a platform—that is, it is a PaaS offering. With each successive chapter, you then explored different focus areas of the SQL Database platform. And now that you have arrived at the last chapter of the book, we present you with an end-to-end treatment for building a cloud solution on top of SQL Database. By “end-to-end,” we mean a stack of layered components, where each layer is concerned with its own area of responsibility, and collectively, they work together to furnish a feature-complete application.
In this chapter, we show you how to combine SQL Database with other components to produce
your own complete SaaS solution. In other words, you will learn how to build layers on top of SQL
Database to deliver a ready-to-use application that runs completely in the cloud on Microsoft Azure.
The solution you will create in this chapter builds on the sample WineCloudDb database we’ve been
using to demonstrate throughout this book, and it includes a website that allows users to place orders
through their browser, and a mobile app that allows users to manage the wines in the database from
their Microsoft Windows Phone device.
You will build the application layer by layer, from the bottom up, starting from the database
level and working your way up to the user interface (UI). The complete solution stack is shown in
Figure 10-1.
Here is a high-level overview of the tasks you will perform in this chapter:
• Import the database schema from WineCloudDb into a new SQL Server database
project. This will enable you to use SQL Server Data Tools (SSDT) to work in offline mode,
disconnected from SQL Database.
FIGURE 10-1 The complete solution is composed of these distinct application layers: browser and Windows Phone clients on top, an ASP.NET MVC website and ASP.NET Web API in the middle, and SQL Database (tables and stored procedures) at the bottom.
• Create stored procedures to control how data in the Order table can be inserted, updated,
or deleted.
• Deploy the offline database project changes back to Microsoft Azure SQL Database.
• Use the Entity Framework (EF) to manage all database connections and commands.
• Design an Entity Data Model (EDM) to configure how EF interacts with the tables and
stored procedures in the database.
• Build a Model-View-Controller (MVC) Web application, which users can access with any
browser to place orders.
• Create a Web API to expose create, retrieve, update, and delete (CRUD) operations for
wines in the database.
■ Create a Windows Phone 8 app with the Windows Phone Software Development Kit (SDK).
• Build a mobile app that communicates with the Web API to implement a wine catalog,
which users can use to view, add, modify, and delete wines with their Windows Phone 8
device.
You will build all of these pieces as separate but related projects inside a single Microsoft Visual
Studio solution.
So many choices
This chapter presents a complete, multitiered cloud solution, using SQL Database on the back
end. Several Microsoft technologies are readily available to achieve this—it is by no means necessary to implement your cloud solution using the particular technologies we chose to use here.
You will create a data access layer using the Entity Framework, but another .NET data-access
technology (such as traditional ADO.NET) might be a perfectly suitable alternative, depending
on the scenario. Although you will create the website with the ASP.NET Model-View-Controller
(MVC) framework, you can certainly choose to do so using standard ASP.NET web forms with
.aspx pages. And for the web service, you will use the increasingly popular ASP.NET Web API to
implement Representational State Transfer (REST) protocol services, although other service platforms such as Simple Object Access Protocol (SOAP) with Windows Communication Foundation
(WCF), WCF Data Services (which also offers quick and easy REST services over EF), or WCF RIA
(Rich Internet Application) Services, and others can be used as well.
Regardless of which particular technologies you choose, however, the core concepts of
multiple tiers and layered design presented in this chapter are the same.
2. If the SQL Server Object Explorer is not visible, click the VIEW menu and choose SQL Server
Object Explorer.
3. In the SQL Server Object Explorer, right-click SQL Server and choose Add SQL Server to display
the familiar Connect To Server dialog.
b. For Authentication, select SQL Server Authentication from the drop-down list. (SQL
Database does not support Windows Authentication.)
c. For Login and Password, type the user name and password you assigned the server when
you created it.
d. Click the Connect button. The server now appears as a collapsed node in the SQL Server
Object Explorer.
7. If a previous version of WineCloudDb is present from work you did in an earlier chapter, delete
it now by doing the following:
b. Click OK to confirm.
10. Type WineCloudDb, and press Enter. The new database now appears in the SQL Server
Object Explorer.
11. Right-click the WineCloudDb database, and choose New Query to open a new query window.
12. Type the code shown in Listing 10-1 into the query window (or paste it in from the listing file
downloaded from the book’s companion website).
13. Press Ctrl+Shift+E to execute the script (or press the play button icon in the query window’s
toolbar).
14. Close the query window. (It isn’t necessary to save the changes.)
In this chapter, you will start with the pre-existing SQL Database in the cloud that you just created
using connected SSDT, and import its schema definition into a new SQL Server database project. Then you will continue developing the database using disconnected SSDT—that is, by working with the offline project and deploying changes incrementally to the live SQL Database.
1. In Visual Studio 2013, click the FILE menu, then choose New | Project to display the New
Project dialog.
2. On the left of the New Project dialog, expand Templates, Other Project Types, and choose
Visual Studio Solutions.
3. Select the Blank Solution template, name the solution WineCloudSolution, and choose any
desired location for the solution as shown in Figure 10-2.
The Solution Explorer now shows the new WineCloudSolution. (If the Solution Explorer is not
visible, click the VIEW menu and choose Solution Explorer.) Now that you have an empty solution,
you’re ready to create a new SQL Server Database project.
There are several ways to create a SQL Server Database project. You can start with an empty
project, design a database structure from the ground up inside the project, and then publish the
entire structure to a new SQL Database. Or, if you already have an existing SQL Database (such as
the WineCloudDb database in our scenario), you can import the database into the project. In the
next several procedures, you will create a new project and then import the WineCloudDb database
structure into the project.
1. Right-click WineCloudSolution in Solution Explorer, and choose Add | New Project to display
the Add New Project dialog.
2. On the left side of the New Project dialog, expand Installed and choose SQL Server.
3. Select the SQL Server Database Project template, and name the project WineCloudDb (it’s
usually a good idea to name the project after the database), as shown in Figure 10-3.
Your solution now has a single database project, but there are no items defined in the project yet.
By choosing a target platform, you are directing Visual Studio to validate the project and verify
that the database design is compatible with that particular version. The validation occurs in real
time—as you modify the project, Visual Studio constantly checks your design in the background and
raises errors if you attempt to do something that is not supported by the specified target platform.
(Chapter 3 discusses important differences between Microsoft Azure SQL Database and on-premises
versions of SQL Server.)
When you create a new SQL Server Database project, the target platform is set to SQL Server 2012
by default. So before making any changes to the database project, it is a good idea to set the target
platform to let Visual Studio know that you intend to deploy the project to Microsoft Azure SQL
Database rather than on-premises SQL Server 2012.
To switch the project’s target platform to SQL Database, follow these steps:
1. Right-click the WineCloudDb project in Solution Explorer, and choose Properties. At the top of the Project Settings tab, notice that the Target Platform is set to SQL Server 2012.
2. In the Target Platform drop-down list, choose Microsoft Azure SQL Database.
3. Click the FILE menu, and choose Save Selected Items (or press Ctrl+S).
With this setting in place, you can work with the project secure in the knowledge that Visual Studio
will alert you if you attempt to do something that is not compatible with SQL Database specifically.
Now it’s time to import the WineCloudDb database into the project.
To import the WineCloudDb database from SQL Database into the project, follow these steps:
1. Right-click the WineCloudDb project in Solution Explorer, and choose Import | Database to
display the Import Database dialog, as shown in Figure 10-5.
2. Beneath Source Database Connection, click the New Connection button to display the
Connection Properties dialog.
3. For Server Name, type the complete host name for the SQL Database server. As usual, this is
the server name randomly assigned when you created the server, followed by
.database.windows.net.
4. Choose Use SQL Server Authentication, and type the user name and password you previously
assigned to the server.
5. Click the drop-down list beneath the Select Or Enter A Database Name radio button, and
select the WineCloudDb database. The Connection Properties dialog should now appear as
shown in Figure 10-6.
6. Click OK to close the Connection Properties dialog and return to the Import Database dialog.
7. Click Start. It takes just a few moments for Visual Studio to examine the database and discover
all the objects it contains, as shown in Figure 10-7.
FIGURE 10-7 Importing the WineCloudDb SQL Database into the WineCloudDb SQL Server
Database project
8. Click Finish.
To create the new Price column in the Wine table, follow these steps:
1. In Solution Explorer, expand the dbo folder, and then expand the Tables folder beneath dbo.
2. Right-click the Wine.sql file, and choose View Designer (or just double-click the Wine.sql file).
This opens the designer in a split-screen view; the top half of the designer displays a grid that
shows all the columns, and the bottom half displays the T-SQL code that creates the table with
those columns.
3. In the grid at the top of the designer, click in the Name cell in the empty row at the bottom of
the grid.
4. Type Price in the Name cell, and then press Tab to advance to the Data Type cell.
5. Type money in the Data Type cell, and then press Tab.
6. Clear the Allow Nulls check box, and then press Tab. This means that SQL Database will not
permit null values when storing rows in the table; each wine will have to have a price.
7. Type 0 in the Default cell. This is necessary because the Wine table already contains rows of
data. Because null values are not permitted in the Price column, this default value will assign
a price of 0 to each existing row in the table when you deploy the new design back to SQL
Database.
8. Click the FILE menu, and choose Save Wine.sql (or press Ctrl+S).
Tip In this procedure, you applied a change to the design grid on top, and Visual Studio
automatically updated the T-SQL code on the bottom. However, the table designer supports bi-directional editing. So you can also apply your changes by editing the T-SQL code
directly on the bottom, and Visual Studio will automatically update the design grid on the
top. You will use this technique shortly when you add the Order table to the database in an
upcoming procedure.
You had to assign default values for the Price column because the Wine table already contains
data, and the table has been designed not to permit null values in this new column. Thus, a default
must be established at this point because some value needs to be assigned to the Price column in the
existing rows. However, once the new table is deployed and the existing rows are updated with the
default values, you might want to remove the default value assignment from the table design so that
new rows added in the future would be required to supply non-NULL values for Price.
1. Right-click the WineCloudDb project in Solution Explorer, and choose Publish to display the
Publish Database dialog, as shown in Figure 10-9.
2. Click the Edit button to the right of the Target Database Connection to display the familiar
Connection Properties dialog. Supply the connection information to the WineCloudDb
database as you’ve done before:
a. For Server Name, type the complete host name for the SQL Database server (the server
name followed by .database.windows.net).
c. Type the user name and password you previously assigned to the server.
d. Click the drop-down list beneath the Select Or Enter A Database Name radio button, and
select the WineCloudDb database (if not already selected by default).
3. Click OK to close the Connection Properties dialog. The Publish Database dialog should now
appear as shown in Figure 10-10.
4. Click the Save Profile As button, type WineCloudDb, and click Save. This saves the connection information you just entered to a file named WineCloudDb.publish.xml so that you won’t need
to reenter it every time you deploy again in the future.
Tip You can click the Generate Script button instead of clicking Publish. This will
also invoke the schema compare operation and generate the change script for
the deployment. But rather than executing the change script, Visual Studio will
open it in a new query window. This gives you the opportunity to view the script
so that you can see exactly what actions will be taken. Then you can choose to
execute the script as-is, edit it, or save it to be executed later.
During the deployment process, Visual Studio displays the progress and status in the Data Tools
Operations window. Figure 10-11 shows the Data Tools Operations window once the deployment
completes successfully.
FIGURE 10-11 Deployment status is displayed in the Data Tools Operations pane
1. If the SQL Server Object Explorer is not visible, click the VIEW menu and choose SQL Server
Object Explorer.
2. In the SQL Server Object Explorer, expand the SQL Server node.
3. Beneath the SQL Server node, expand the server node for the SQL Database (the one with
your server name followed by .database.windows.net).
7. Right-click the dbo.Wine table node, and choose View Data. This opens a new window showing the four rows for the wines added in Listing 10-1, all of which have a price of 0 because of the default value you assigned to the Price column when you added it to the table.
8. Click in the Price cell on the first row, and change the value from 0 to 34.90 (or just make up
any price).
9. Repeat the previous step for each of the three remaining rows, changing the Price column
in those rows from 0 to 48.50, 42.00, and 52.00 (or, again, assign any fictitious values). Your
screen should appear similar to the one shown in Figure 10-12.
FIGURE 10-12 Using the SQL Server Object Explorer to edit prices in the Wine table
In the next procedure, you will return to the database project to create the Order table, and then
deploy the project once again to SQL Database. This demonstrates the iterative development cycle
you follow when designing databases with a SQL Server Database project in Visual Studio:
■ Make design changes in the offline database project.
■ Deploy the changes to SQL Database via a publish process. This generates and executes a change script based on a schema compare operation between the project and the database.
2. Beneath the dbo folder, right-click the Tables folder and choose Add | Table.
3. Name the table Order.sql, and click Add to open the table designer. The designer starts with
a single integer column named Id that is already defined as the table’s primary key.
b. In the Properties window, expand Identity Specification and change the (Is Identity)
property from False to True. (If the Properties window is not visible, click the VIEW menu
and choose Properties Window.) When you insert new orders into the table, this tells SQL
Database to automatically assign incrementing integer values for this column in each new
row.
a. Type OrderedOn in the Name cell beneath OrderId, and press Tab to advance to the Data
Type cell.
6. Add the remaining columns using the code window instead of the table schema grid by
completing the following steps:
a. Click in the code window beneath the table schema grid to place the text cursor
immediately before the closing parenthesis character.
The CustomerId and WineId columns are foreign keys to the Customer and Wine tables,
respectively, so the last step in designing the Order table is to establish foreign-key relationships on
these columns. Doing so will ensure that an order cannot be placed for customers or wines that don’t
actually exist.
To create the foreign-key relationship between the Order and Customer tables, follow these steps:
1. In the upper right area of the table designer, right-click Foreign Keys and choose Add New
Foreign Key.
2. Name the new foreign key FK_Order_Customer. (It is best practice to assign foreign-key
names that indicate which tables participate in the relationship.) This generates an incomplete
FOREIGN KEY clause in the T-SQL code window at the bottom of the designer.
3. Edit the FOREIGN KEY clause in the T-SQL code window to read FOREIGN KEY (CustomerId)
REFERENCES Customer(CustomerId).
Next, repeat the same steps to create the foreign-key relationship between the Order and Wine
tables:
1. In the upper right area of the table designer, right-click Foreign Keys and choose Add New
Foreign Key.
2. Name the new foreign key FK_Order_Wine.
3. Edit the second FOREIGN KEY clause added in the T-SQL code window at the bottom of the
designer to read FOREIGN KEY (WineId) REFERENCES Wine(WineId).
4. Click the FILE menu, and choose Save Order.sql (or press Ctrl+S).
The completed design for the Order table should now appear as shown in Figure 10-13.
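Figure 10-13 isn’t reproduced here, but the T-SQL in the designer’s code window should look roughly like the following sketch. The column data types are inferred from the chapter’s description (the figure in the book is authoritative), and the two foreign-key constraints are the ones you just named:
CREATE TABLE [dbo].[Order]
(
    [OrderId]    INT       NOT NULL PRIMARY KEY IDENTITY,
    [OrderedOn]  DATETIME2 NOT NULL,
    [CustomerId] INT       NOT NULL,
    [WineId]     INT       NOT NULL,
    [Quantity]   INT       NOT NULL,
    [UnitPrice]  MONEY     NOT NULL,
    [Price]      MONEY     NOT NULL,
    [AddedOn]    DATETIME2 NOT NULL,
    [UpdatedOn]  DATETIME2 NULL,
    CONSTRAINT [FK_Order_Customer] FOREIGN KEY ([CustomerId]) REFERENCES [Customer]([CustomerId]),
    CONSTRAINT [FK_Order_Wine] FOREIGN KEY ([WineId]) REFERENCES [Wine]([WineId])
)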
Creating stored procedures to facilitate access to the table protects the database against storing invalid data and ensures that critical business calculations and validation rules cannot be bypassed. So rather than allowing direct access to the table, client applications will be given indirect access to the tables via stored procedures that apply whatever rules you choose to enforce. In a sense, this establishes a “service layer” over the tables in the database. In the next several procedures, you will create three stored procedures for the Order table to ensure that the following rules are in place:
■ The Quantity column in every row is always assigned a positive number greater than zero.
■ The UnitPrice column in every row is always derived from the current price of the wine specified by the WineId column.
■ The Price column in every row is always calculated as the result of multiplying Quantity and UnitPrice.
■ For updated rows, the UpdatedOn column is always assigned the current date and time on the
database server, and the original AddedOn column is never overwritten.
The stored procedures that enforce these rules are shown in Listing 10-2 (insert), Listing 10-3
(update), and Listing 10-4 (delete):
-- Ensure orders less than one year old are never deleted
IF @DaysOld < 365
THROW 50000, 'Orders less than one year old cannot be deleted', 1;
The UpdateOrder stored procedure performs the same validation on the incoming @Quantity
parameter to ensure that an existing order’s quantity is not changed to a zero or negative number. It
also repeats the same pricing logic to recalculate @UnitPrice and @Price if an existing order’s quantity
or wine selection is changed. (Certainly, the common pricing code can be maintained in a single user-
defined function that is shared by both the InsertOrder and UpdateOrder stored procedures.) The
@UpdatedOn variable is then declared and assigned the current date and time by the SYSDATETIME
function. An UPDATE statement then updates the row (using the current date and time in
@UpdatedOn for the UpdatedOn column). Notice that the AddedOn column is not affected by the
UPDATE statement, which ensures that the date and time stored in AddedOn at the time the row was
created can never be modified.
In the DeleteOrder stored procedure, the number of elapsed days since OrderDate is calculated
and stored in the @DaysOld variable. The @@ROWCOUNT function is then tested to verify that the
specified order actually exists, and then the @DaysOld variable is tested to ensure that the date of the existing order is at least one year (365 days) ago. Finally, a DELETE statement deletes the specified
order from the table.
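Listings 10-2 through 10-4 contain the full procedures. As a rough illustration of the pattern they follow, an insert procedure along the lines described above might look like the following sketch (this is not the book’s exact listing; the parameter and column names simply follow the chapter’s conventions):
CREATE PROCEDURE InsertOrder
    @OrderedOn datetime2,
    @CustomerId int,
    @WineId int,
    @Quantity int
AS
BEGIN
    -- Reject zero or negative quantities
    IF @Quantity < 1
        THROW 50000, 'Quantity must be greater than zero', 1;

    -- Derive the unit price from the wine's current price, then extend it
    DECLARE @UnitPrice money = (SELECT Price FROM Wine WHERE WineId = @WineId);
    DECLARE @Price money = @UnitPrice * @Quantity;
    DECLARE @AddedOn datetime2 = SYSDATETIME();

    INSERT INTO [Order]
        (OrderedOn, CustomerId, WineId, Quantity, UnitPrice, Price, AddedOn)
    VALUES
        (@OrderedOn, @CustomerId, @WineId, @Quantity, @UnitPrice, @Price, @AddedOn);

    -- Return the new OrderId so it can be bound back to the Order entity by EF
    SELECT CAST(SCOPE_IDENTITY() AS int) AS OrderId;
END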
More Info There are other techniques besides stored procedures that can protect tables
from invalid data. For example, you can define a check constraint on the Quantity column
to ensure that negative numbers are not permitted, instead of testing for that condition in
a stored procedure. You can also create a trigger to check that an order is at least one year
old before permitting its row to be deleted. In general, however, triggers are best avoided, because they introduce nondeterministic behavior. (That is, when multiple triggers are defined, the order in which they fire is not guaranteed, which can lead to subtle bugs that are difficult to troubleshoot.) Using stored procedures is often the best approach, because they
provide a clean layer over tables in which you can consolidate all your custom logic (such
as calculating prices, and controlling the date and time values assigned to the AddedOn
and UpdatedOn columns, as shown here).
1. In Solution Explorer, right-click the dbo folder and choose Add | New Folder.
3. Right-click the Stored Procedures folder, and choose Add | Stored Procedure.
4. Name the new file for the stored procedure InsertOrder.sql, and click Add.
5. Replace the template code generated automatically by Visual Studio with the code shown
earlier in Listing 10-2. Your screen should appear similar to Figure 10-14.
FIGURE 10-14 Adding a new stored procedure to the SQL Server Database project
6. Right-click the Stored Procedures folder, and choose Add | Stored Procedure.
7. Name the new file for the stored procedure UpdateOrder.sql, click Add, and replace the
template code with the code shown earlier in Listing 10-3.
8. Right-click the Stored Procedures folder, and choose Add | Stored Procedure.
9. Name the new file for the stored procedure DeleteOrder.sql, click Add, and replace the
template code with the code shown earlier in Listing 10-4.
10. Click the FILE menu, and choose Save All (or press Ctrl+Shift+S).
1. Right-click the WineCloudDb project in Solution Explorer, and choose Publish to display the
Publish Database dialog. (See Figure 10-10 earlier in the chapter.)
2. Click the Load Proile button toward the bottom of the dialog.
3. Double-click the WineCloudDb.publish.xml file to load the connection information that you
saved during the previous deployment.
As when you deployed the first time, Visual Studio generates and executes the change script that
updates the database in the cloud to match the database design in the project. This time, that means
the WineCloudDb SQL Database is updated with the new Order table and its foreign-key relationships
to the Customer and Wine tables, as well as the three new stored procedures for inserting, updating,
and deleting rows in the Order table. Refresh the WineCloudDb database node in SQL Server Object
Explorer to verify that the database now contains the new objects published from the project.
The database design is complete, and you are now ready to start working on the solution’s data
access layer.
Of course, the Microsoft .NET Framework includes ADO.NET, which provides a set of classes you
can use to build a data access layer. Since the very first version of .NET released in 2002, developers have had two choices for working with ADO.NET. One option is to use the raw ADO.NET objects, which include connections, commands, and readers. This approach requires a lot of manual effort, because you need to write explicit code to connect to the database, issue commands to request data, transfer the requested data from readers into objects, track changes made to those objects, and then finally issue commands to send the modified objects back to the database.
The second conventional ADO.NET choice is to use the Dataset object in conjunction with data
adapters. Visual Studio provides a graphical Dataset designer that automatically generates a lot of
code for you. The generated code conigures the connection and command objects, and it maps indi-
vidual data elements (tables and columns) between the database and the strongly typed in-memory
Dataset object. Once populated, a Dataset can track its own changes, making it relatively easy for you
to push updated data back to the database. This approach provides a layer of abstraction that relieves
you from a great deal of the manual effort required to achieve the same result using the raw ADO.
NET classes. However, the Dataset is not a true business object or entity. Today, therefore, you won’t
find many scenarios where it makes sense to use the Dataset rather than EF when building the DAL in
a new .NET application.
The overriding point is that it’s far more important that you have a properly implemented DAL in
place than which approach you actually decide to take. Certainly, every case is different, but in many
common line-of-business (LOB) scenarios, you will find EF more than well suited for the task. EF dramatically simplifies data access by abstracting away the underlying database connection, command,
and reader objects and providing a robust set of object services capable of materializing objects
retrieved from querying the database, tracking them in memory, and pushing their changes back to
the database. EF can also dynamically generate SELECT statements to query the database and INSERT,
UPDATE, and DELETE statements to update the database, or it can provide the same object services
equally well by invoking stored procedures that let you maintain total control over how queries are
executed and updates are processed. There are also many more advanced mapping possibilities in EF
that are far beyond the scope of this chapter, such as the ability to model inheritance, entity-splitting,
table-splitting, and many-to-many relationships.
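To make this concrete, once the Entity Data Model you build in the next section is in place, a typical EF query looks something like the following sketch. (WineCloudDbEntities is the context name used later in this chapter; the pluralized Wines set name and the Name property are assumptions based on the sample schema used throughout the book.)
using System;
using System.Linq;
// ...inside a method, after referencing the WineCloudModel project:
using (var db = new WineCloudDbEntities())
{
    // EF translates this LINQ query into a SELECT against the Wine table
    var premiumWines = db.Wines
        .Where(w => w.Price > 40)
        .OrderBy(w => w.Name)
        .ToList();

    foreach (var wine in premiumWines)
    {
        Console.WriteLine("{0} - {1:C}", wine.Name, wine.Price);
    }
}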
The EDM consists of three parts. First there is the storage schema, which describes the physical
structure of the database. Then there is the conceptual schema, which describes classes for the business entities used in the application. Finally, you have the mapping schema, which defines how the storage and conceptual schemas relate to one another. These three pieces (collectively called the model’s metadata) are self-contained in a single .edmx file inside your project.
The EDM you will create in the next section defines a simple one-to-one mapping between the
conceptual and storage schemas. However, you should be aware that much more complex mappings
are possible. For example, a single entity in the conceptual schema might be mapped to multiple
tables in the database, in which case EF will join the tables together at query time and split the
updates across them when saving changes. Conversely, multiple entity types might be mapped to a
single table in the database, in which case EF distinguishes each row in the table based on a designated column that identifies the entity type. This type of mapping can also be used to define inheritance
in the conceptual schema.
In this section, you will create the EDM in its own class library project for an easily shareable
DAL. Then you will reference the DAL class library from a separate ASP.NET MVC project. As you will
soon see, it is easy to copy the connection string from the class library’s app.config file into the MVC project’s web.config file to access the database.
1. Right-click WineCloudSolution in Solution Explorer, and choose Add | New Project to display the Add
New Project dialog.
2. On the left side of the New Project dialog, expand Installed and choose Visual C#.
3. Select the Class Library template, and name the project WineCloudModel.
5. The Class1.cs file created automatically by Visual Studio can be deleted, so right-click it,
choose Delete, and click OK.
You are now ready to create the Entity Data Model in the WineCloudModel project.
1. Right-click the WineCloudModel project in Solution Explorer, and choose Add | New Item to
display the Add New Item dialog.
2. On the left side of the Add New Item dialog, expand Installed, Visual C#, and choose Data.
3. Click the ADO.NET Entity Data Model item, and name the file WineModel.edmx, as
shown in Figure 10-15.
5. On the Choose Model Contents page, the Generate From Database option is already
selected by default, so just click Next.
6. On the Choose Your Data Connection page, click the New Connection button.
8. In the familiar Connection Properties dialog, supply the same connection information to
connect to the WineCloudDb database that you’ve used throughout this chapter:
b. Choose Use SQL Server Authentication, and type the user name and password you
assigned to the server.
c. This time, be sure to select the Save My Password check box. Otherwise, the password
will not be saved for the connection string at runtime in step 9.
d. In the drop-down list beneath the Select Or Enter A Database Name radio button, select
the WineCloudDb database.
9. Choose Yes to include sensitive data (namely, the password) in the connection string. The
Entity Data Model Wizard should now appear as shown in Figure 10-16.
FIGURE 10-16 Setting the data connection in the Entity Data Model Wizard
11. The desired version, Entity Framework 6.0, should already be selected by default, so just click
Next to display the Choose Your Database Objects And Settings page.
12. Expand Tables, dbo, and then select the check boxes for the Customer, Order, and Wine tables.
(Don’t include the _RefactorLog table in the model; this table was generated automatically and
is used only by the refactoring features of the SQL Server Database Project.)
13. Expand Stored Procedures and Functions, dbo, and then select the dbo check box to select
the InsertOrder, UpdateOrder, and DeleteOrder stored procedures in the database.
14. Deselect the last check box in the dialog, Import Selected Stored Procedures And Functions
Into The Entity Data Model. The Entity Data Model Wizard should now appear as shown in
Figure 10-17.
More Info This step isn’t strictly necessary, but it does prevent needless
overhead in the EDM. Deselecting this check box means that you never intend
to call the InsertOrder, UpdateOrder, and DeleteOrder stored procedures directly
via EF, and it’s therefore not necessary for the wizard to create function imports
and complex types (these are essentially strongly-typed wrappers around stored
procedure calls and the schema results returned by those stored procedure calls).
Instead, you will shortly map these stored procedures to the Order entity so that
EF calls them automatically whenever it needs to save changes to the Order table
in the database.
FIGURE 10-17 Selecting tables and stored procedures to be imported into the Entity Data Model
Note After clicking Finish, you might receive multiple Security Warning messages for
running the template. (This refers to the special template used internally by the EDM
designer to automatically generate code.) If you receive this warning, just click OK. (You
can also select the check box to prevent the template security warning from appearing
again repeatedly.)
Visual Studio adds the necessary EF references to the project, generates the Entity Data Model,
and then displays it in the EDM designer as shown in Figure 10-18.
FIGURE 10-18 The generated Entity Data Model displayed in the EDM designer
The Wine, Customer, and Order entities displayed in the EDM designer represent classes that
correspond to tables of the same name that were discovered in the database. Similarly, each of the
entity classes has properties that are mapped to columns of the same name in each table. Further-
more, notice that each entity has navigation properties and associations that are based on foreign-key
relationships discovered between the tables in the database:
■ The Wine entity has the navigation properties Customers and Orders (both plural).
• In Listing 10-1, you established a foreign-key relationship between the Customer table’s FavoriteWineId column and the WineId primary-key column in the Wine table. You defined this column to allow nulls, meaning that some customers might not have a FavoriteWineId value. Thus, the association between the entities is displayed graphically in the designer by a connecting line, with a “0..1” appearing on the Wine side (which indicates zero or one favorite wines) and an asterisk (*) symbol (which indicates many) on the Customer side. The Customers navigation property is on the “many” side of this relationship, so all the customers whose favorite wine is a given wine are accessible through that Wine entity’s Customers navigation property.
• Later, you established a foreign-key relationship between the Order table’s WineId column
and the WineId primary-key column in the Wine table. You defined this column not to permit nulls, meaning that every Order row must have a WineId value identifying the wine that was ordered. Thus, the association between the entities is displayed graphically in the designer by a connecting line, with a “1” appearing on the Wine side (which indicates one and only one wine) and an asterisk (*) symbol (indicating many) on the Order side. The Orders navigation property is on the “many” side of this relationship, so all the orders for a
given wine are accessible through that Wine entity’s Orders navigation property.
■ The Customer entity has navigation properties Wine (singular) and Orders (plural).
• The Wine property is on the “zero or one” side of the customer’s favorite wine relationship,
so each customer’s favorite wine (if that customer has one) can be accessed through the
Customer entity’s Wine navigation property.
• You also established a foreign-key relationship between the Order table’s CustomerId column and the CustomerId primary-key column in the Customer table. You defined this column not to permit nulls, meaning that every Order row must have a CustomerId value identifying the customer that placed the order. Thus, the association between the entities is displayed graphically in the designer by a connecting line, with a “1” appearing on the Customer side and an asterisk (*) symbol (indicating many) on the Order side. The Orders navigation property is on the “many” side of this relationship, so all the orders for a given customer can be accessed through the Customer entity’s Orders navigation property.
■ The Order entity has the navigation properties Customer and Wine (both singular).
• The Customer property is on the “one” side of the order’s customer relationship, so each order’s customer can be accessed through the Order entity’s Customer navigation property.
• Similarly, the Wine property is on the “one” side of the order’s wine relationship, so each order’s wine can be accessed through the Order entity’s Wine navigation property.
The Entity Data Model Wizard imported both tables and stored procedures. But unlike the tables,
which the wizard also maps to same-named entities in the conceptual model, stored procedures do
not get mapped automatically. So it’s still your job to map the three stored procedures (InsertOrder,
UpdateOrder, and DeleteOrder) to the Order entity in the model. By default (that is, if you don’t do
this), EF will simply generate direct T-SQL INSERT, UPDATE, and DELETE statements when you save
changes to the Order table, and you won’t get the added functionality (such as custom validation and
pricing logic) that is programmed into the stored procedures.
Recall that InsertOrder returns a single-row resultset with the new OrderId value assigned to the
new Order row. (See the SELECT statement at the bottom of Listing 10-2.) When you map this stored
procedure to the Order entity, you inform the EDM of the value it returns by defining result bindings. This instructs EF to refresh new Order entity objects by “shoving” the return value back into the
OrderId property of the memory-resident instance after performing an insert.
1. Right-click on the Order entity, and choose Stored Procedure Mapping. This displays the
Mapping Details window.
2. In the Mapping Details window, click <Select Insert Function>, expand the drop-down list,
and choose the InsertOrder stored procedure. The designer automatically maps the stored
procedure input parameters to the same-named entity properties, but you need to map the
OrderId value returned by the stored procedure back into the entity manually in the next step.
3. Beneath Result Column Bindings, click <Add Result Binding>, type OrderId, and press
Enter. The designer correctly maps this result column to the OrderId property.
4. Click <Select Update Function>, expand the drop-down list, and choose the UpdateOrder
stored procedure. Again, the stored procedure parameters get mapped automatically to the
same-named entity properties of the Order entity.
5. Click <Select Delete Function>, expand the drop-down list, and choose the DeleteOrder
stored procedure.
6. Click the FILE menu, and choose Save All (or press Ctrl+Shift+S).
Note Once again, click OK on any template Security Warning dialogs that
appear.
After completing this procedure, the Mapping Details window should appear as shown in
Figure 10-19.
FIGURE 10-19 The EDM designer Mapping Details window with stored procedures mapped to the Order entity
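With these mappings in place, EF routes saves through the stored procedures transparently. The following sketch shows the effect of the InsertOrder result binding (the Orders set name is the default pluralization generated by the wizard; the other names come from the EDM created earlier):
using (var db = new WineCloudDbEntities())
{
    var order = new Order
    {
        CustomerId = 1,
        WineId = 2,
        OrderedOn = DateTime.UtcNow,
        Quantity = 3
    };
    db.Orders.Add(order);

    // EF calls the mapped InsertOrder stored procedure instead of issuing a
    // direct INSERT; the OrderId the procedure returns is pushed back into
    // the tracked entity via the result column binding.
    db.SaveChanges();
    Console.WriteLine("New OrderId: " + order.OrderId);
}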
Note This chapter presents a very simple application, which is just enough to demonstrate
how multiple components in a layered solution interact. In real-world scenarios, authentication and authorization must also be implemented; however, those details lie beyond the
scope of this chapter.
An MVC website works by examining the requested URL and determining which controller, and
which action on that controller, the request should be directed to. A controller is really just a class,
and an action is really just a method of that class. When the action method runs, it returns the view
that gets rendered as Hypertext Markup Language (HTML) and JavaScript in the client browser. MVC
binds the view to a model that defines the data that gets supplied to the view.
The rules that govern how a URL maps to specific controllers and actions are specified in the MVC application’s routing table. Default behavior (such as which controller and action is invoked when none is specified in the URL) is also configured in the routing table. The default routing table for a new MVC application specifies a default controller named Home with a default action named Index,
which means that the Index method of the HomeController class will be invoked for a URL that does
not specify a controller and action.
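In a new MVC 5 project, that default is expressed in the route registration Visual Studio generates (typically in App_Start/RouteConfig.cs), which looks essentially like this:
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        routes.IgnoreRoute("{resource}.axd/{*pathInfo}");

        // With no controller or action in the URL, requests go to Home/Index
        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}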
You will also use the “scaffolding” template feature in Visual Studio to create the HomeController class. This code-generation feature automatically creates several actions in the controller class, along with individual views that correspond to each action. When used with Entity Framework, these scaffolded actions and views fully implement standard select, insert, update, and delete functionality for
any entity in the EDM.
1. Right-click WineCloudSolution in Solution Explorer, and choose Add | New Project to display the Add
New Project dialog.
2. On the left side of the New Project dialog, expand Installed, Visual C#, and choose Web.
4. Name the project WineCloudWeb (as shown in Figure 10-20), and click OK.
6. Select the MVC and Web API check boxes, as shown in Figure 10-21. This adds project
references to the MVC assemblies to support the website, as well as references to the Web API
assemblies for the REST services you will add later to support the mobile Windows Phone 8
app.
FIGURE 10-21 Selecting core references for MVC and Web API in a new empty ASP.NET web
application project
7. Click OK.
Before the WineCloudWeb MVC application can use the EDM in the WineCloudModel project as
the model for the UI, two things need to be done:
■ The WineCloudWeb project must add a reference to the WineCloudModel project.
■ The entity connection string must be copied from the WineCloudModel project to the WineCloudWeb application.
You will perform both these tasks in the next two procedures. First, to reference the
WineCloudModel project from the WineCloudWeb project, follow these steps:
1. Expand the WineCloudWeb project in Solution Explorer to reveal its References node.
2. Right-click the References node, and choose Add Reference to display the Reference Manager
dialog.
3. Expand the Solution item on the left, and click the Projects tab beneath Solution. This allows
you to select from other projects in the solution to reference.
FIGURE 10-22 Adding a reference from the ASP.NET Web application project to the DAL project
5. Click OK.
Even though the EDM and DAL are in the WineCloudModel project, EF always looks in the configuration file of the launching executable application or website at runtime to find the entity connection string, which, in turn, contains the actual database connection string. When the EDM is created in a class library project, as is the case here, the connection string is contained in the class library project’s App.Config file. However, the connection string in App.Config will never be found at runtime, because a class library is a DLL file with no entry point (that is, it can never be the launching executable application).
In this solution, WineCloudWeb is the launching application, so EF will look inside its Web.config file for the entity connection string whenever data access is required. If the connection string is not present in Web.config, EF won’t find it at runtime and will throw an exception as a result. So you need
to perform a simple copy/paste operation to resolve the situation.
To copy the entity connection string and paste it into Web.config, follow these steps:
1. Copy the connection string from App.Config in the WineCloudModel project by doing the following:
a. Expand the WineCloudModel project in Solution Explorer to reveal its App.Config file.
b. Double-click App.Config to open it in the editor.
c. Select the entire <connectionStrings> section. This should contain a single connection
named WineCloudDbEntities and include the surrounding <connectionStrings> and
</connectionStrings> tags.
2. Paste the connection string to Web.conig in the WineCloudWeb project by doing the
following:
a. Expand the WineCloudWeb project in Solution Explorer to reveal its Web.config file.
b. Double-click Web.config to open it in the editor.
c. Click to position the text cursor just after the <configuration> element and just before the <appSettings> element at the top of the file.
3. Click FILE and choose Save Web.config (or press Ctrl+S) to save the changes.
The WineCloudWeb project is now all set up to use the EDM defined in the WineCloudModel
project as the model in the MVC application.
In the next procedure, you will scaffold a new controller with views and actions for the Order
entity. As we mentioned earlier, the Home controller is the default controller if one is not specified
on the URL. Therefore, you will name the controller HomeController (even though OrderController
is arguably a better name, given the controller’s purpose). By naming it HomeController, you won’t
need to specify anything in the URL to get to the Home controller’s Index action, and the default MVC
routing rules won’t need to be modified.
To create the scaffolding for a new Home MVC controller with actions and views for the Order
entity, follow these steps:
1. Build the WineCloudModel project by right-clicking it in Solution Explorer and choosing Build.
Important The WineCloudModel project must be built before it can be used by other projects that reference it. If you don’t first build this project, you will encounter errors when attempting to add the scaffolded views in this procedure, because they are based on the EDM in the WineCloudModel project.
2. Right-click the Controllers folder in the WineCloudWeb project in Solution Explorer, and
choose Add | New Scaffolded Item.
3. In the Add Scaffold dialog, select MVC 5 Controller With Views, Using Entity Framework, as
shown in Figure 10-23.
FIGURE 10-23 The Add Scaffold dialog has several choices for creating a new MVC controller class.
4. Click Add.
b. For Model Class, choose Order (WineCloudModel) from the drop-down list.
d. Deselect the Reference Script Libraries and Use A Layout Page check boxes. The Add
Controller dialog should appear similar to Figure 10-24.
FIGURE 10-24 Adding an MVC 5 controller class, with automatically generated views for EF
e. Click Add.
Look at the WineCloudWeb project in the Solution Explorer, and take a moment to review what
Visual Studio just created for you. First open the HomeController.cs class that was added to the
Controllers folder. If you examine the code, you will notice several things:
■ The class inherits from the System.Web.Mvc.Controller base class, which is what makes this an
MVC controller class.
■ Several public methods that return an ActionResult object have been created. These are the
controller’s action methods. Based on a combination of the URL syntax of an incoming request
and the HTTP method used to issue the request (GET or POST), one of these action methods
will be called to handle the request.
■ Each action method is preceded by a comment line that indicates the type of HTTP request
(GET or POST) and URL syntax that the action method will handle:
• A POST request responds by creating a new order, modifying an existing order, confirming
the deletion of an existing order, or deleting an existing order.
■ Each action method returns a view to satisfy the request. The actual view that gets returned is
based on the action and the model object returned in the action method’s ActionResult.
■ The first action method is named Index, which matches the default action of Index when no action is specified in the URL with a GET request. Because the Home controller is also the default controller, this Index method is the one that will be called if no controller and action is specified in the URL. This method retrieves all the orders in the database, along with each order’s related customer and wine objects. It then returns a view that matches the action name and a model object for the list of orders, which is the (same-named) Index view. (A sketch of this action appears just after this list.)
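For reference, the scaffolded Index action (and the controller skeleton around it) looks essentially like the following sketch; the WineCloudDbEntities context name comes from the EDM created earlier, and the Orders set name is the wizard’s default pluralization:
using System.Data.Entity;
using System.Linq;
using System.Web.Mvc;

public class HomeController : Controller
{
    private WineCloudDbEntities db = new WineCloudDbEntities();

    // GET: / (the default controller and action) or /Home/Index
    public ActionResult Index()
    {
        // Eager-load each order's related Customer and Wine entities
        var orders = db.Orders.Include(o => o.Customer).Include(o => o.Wine);
        return View(orders.ToList());
    }

    protected override void Dispose(bool disposing)
    {
        if (disposing)
        {
            db.Dispose();
        }
        base.Dispose(disposing);
    }
}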
You can (and probably will) modify or extend the controller class to accommodate specific
requirements of your application. For example, you can add and remove actions, or you can change
their behavior. For this project, the scaffolding has generated all the actions needed to support
viewing, adding, modifying, and deleting orders with the website, so the generated code is ready to
be used.
Next have a look at the Views folder in Solution Explorer. Expand the Views folder, and notice that
it now contains a Home subfolder. This is more of the convention-based approach that MVC takes:
things are found by name. Thus, views that serve the actions of a specific controller are contained in a subfolder beneath Views that is named after the controller. Expand the Home subfolder and you will see several .cshtml files. These are the view files generated by the scaffolding, and there is one for each of the Home controller actions. Again, by MVC naming convention, the view files are named
after the action method of the controller:
■ Create.cshtml
■ Delete.cshtml
■ Details.cshtml
■ Edit.cshtml
■ Index.cshtml
With the model coming from the EDM in the WineCloudModel DAL project, and the views and
controllers generated by scaffolding, there is just a small amount of manual work needed to get this
MVC website up and running. Specifically, the generated Create and Edit views include HTML input
controls for every property of the Order entity, including properties you actually don’t want the user
to provide values for. Recall that logic in the InsertOrder (shown in Listing 10-2) and UpdateOrder
(shown in Listing 10-3) stored procedures are responsible for setting the AddedOn and UpdatedOn
properties to the current date and time, and that they calculate the UnitPrice and Price properties
based on the particular wine and quantity being ordered. The scaffolding logic is smart, but it’s not
smart enough to understand that you don’t want input fields for these four properties present in the
Create and Edit views. So it’s up to you to remove them yourself.
1. Expand the Home folder beneath the Views folder of the WineCloudWeb project in Solution Explorer.
2. Double-click Create.cshtml to open it in the editor.
3. Among the <div> elements for each Order property, find and delete the four <div> elements for the UnitPrice, Price, AddedOn, and UpdatedOn properties.
4. Double-click Edit.cshtml to open it in the editor.
5. Repeat the same edit you just performed for Create.cshtml to delete the <div> elements for UnitPrice, Price, AddedOn, and UpdatedOn.
6. Click the FILE menu, and choose Save All (or press Ctrl+Shift+S).
1. Right-click the WineCloudWeb project in Solution Explorer, and choose Set As Startup Project.
2. Press F5 (or click the Internet Explorer play button in the toolbar) to launch the website in
Internet Explorer. As already explained, with no controller or action specified in the URL, this
navigates to the Index view of the Home controller by default. This view displays a list of all
the orders in the system, which is empty at this time. The view also provides a Create New link,
which navigates to the Create view.
3. Click the Create New link to navigate to the Create view of the Home controller. The data
entry screen appears as shown in Figure 10-25.
9. After the order is saved to the database, the browser redirects to the Index action of the
Home controller. This displays the list of orders, including the order you just placed, as shown
in Figure 10-26.
10. Just to get the feel for it, click Create New to enter another order or two. Also try out the Edit,
Details, and Delete links.
To suit your needs in a production scenario, you will surely need to take this project much further. Beyond obvious aesthetics, one of the many things you would still need to do on your own to customize this project for a production application is to implement proper exception handling. If an error
occurs in SQL Database when you attempt to save an order, a DbUpdateException gets thrown on
the EF call to the SaveChanges method in the HomeController class. Because you haven’t written any
exception-handling code, Visual Studio will break on the error. When you press F5 to continue, the
browser will display the default (and rather unfriendly) unhandled error page, because you haven’t
designed a friendlier unhandled error page customized for your own application. The default error
page shows the underlying SqlException that was thrown by SQL Database. If the SqlException was
thrown because of a THROW statement in one of the validation checks inside the stored procedures,
the error page also shows the message text of the validation rule.
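As a starting point, a minimal way to catch the error in the scaffolded Create (or Edit) POST action and redisplay the view with the stored procedure’s message might look like the following sketch (this assumes the scaffolded action’s Order parameter is named order, and it is not production-grade error handling):
using System.Data.Entity.Infrastructure;
using System.Data.SqlClient;
// ...inside the [HttpPost] Create action, replacing the bare SaveChanges call:
try
{
    db.Orders.Add(order);
    db.SaveChanges();
    return RedirectToAction("Index");
}
catch (DbUpdateException ex)
{
    // A THROW in InsertOrder or UpdateOrder surfaces as the innermost
    // SqlException, whose Message contains the validation text.
    var sqlError = ex.GetBaseException() as SqlException;
    ModelState.AddModelError(string.Empty,
        sqlError != null ? sqlError.Message : "The order could not be saved.");
    return View(order);
}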
With no exception handling or client-side validations in place, it's easy to prove that the validations
in the stored procedures embedded in the EDM are working as expected. You should simply
encounter an unhandled exception if you attempt to enter an invalid order. Again, of course, for a
production application, you need to implement a far more robust exception-handling strategy that
can distinguish between different types of errors, determine whether an error message is safe or
unsafe to display, and might also include additional logging, notification, and navigation logic.
For example, if you try to create or update an order with a quantity value lower than 1, the
quantity validation in either the InsertOrder or UpdateOrder stored procedure will THROW an error. In
turn, you will receive an error page as shown in Figure 10-27.
FIGURE 10-27 The unhandled exception page for an error thrown by the stored procedure’s quantity validation
You can also test the logic in the DeleteOrder stored procedure that protects orders less than a
year old from being deleted. If you create a new order with an OrderedOn date more than one year
old, you will have no problem deleting it. But if you try to delete an order less than a year old, you will
receive an error page as shown in Figure 10-28.
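When you do add exception handling for production, one possible shape for it is sketched below. This is purely illustrative; the action signature, the db context field name, and the friendly message are assumptions rather than code from the chapter's project, and a real strategy would also add logging and distinguish recoverable errors.

using System.Data.Entity.Infrastructure;   // DbUpdateException
using System.Data.SqlClient;               // SqlException

[HttpPost]
[ValidateAntiForgeryToken]
public ActionResult Create(Order order)
{
    try
    {
        if (ModelState.IsValid)
        {
            db.Orders.Add(order);
            db.SaveChanges();
            return RedirectToAction("Index");
        }
    }
    catch (DbUpdateException ex)
    {
        // The innermost exception is the SqlException raised by THROW in the stored procedure.
        var sqlEx = ex.GetBaseException() as SqlException;
        ModelState.AddModelError(string.Empty,
            sqlEx != null ? sqlEx.Message : "The order could not be saved. Please try again.");
    }
    return View(order);
}

With a handler along these lines in place, the validation message from the stored procedure's THROW statement can surface on the form itself instead of on the unhandled error page.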
Up to this point, you have been running the website locally, even though the local website has
been interacting with the live Azure SQL Database in the cloud. In the next section, you will deploy
the website to Microsoft Azure so that both the website and the database are running in the cloud.
5. In the URL text box, type winecloudweb. This specifies that the website will be accessible
at http://winecloudweb.azurewebsites.net.
6. In the REGION drop-down list, choose the same (or nearest) region you selected for the
SQL Database. Keeping all the application components hosted in the same region maximizes
performance and avoids cross-region bandwidth charges. (See Chapter 2, “Configuration and pricing,” for more
information on pricing.) Your screen should appear similar to Figure 10-29.
7. Click CREATE WEB SITE. It takes just a few moments to create the website, and the portal
indicates that the website is running, as shown in Figure 10-30.
8. Click winecloudweb in the NAME column to display the dashboard page for the website.
9. Beneath Publish Your App, click the Download The Publish Profile link.
10. After a brief moment, the publish settings file is generated and you are prompted to
open or save it, as shown in Figure 10-31.
FIGURE 10-31 Generating a publish profile for deploying a Microsoft Azure website
11. Click the drop-down portion of the Save button, and choose Save As.
Using the profile you just downloaded, you can now deploy the project to the website from Visual
Studio. To do so, follow these steps:
2. Right-click the WineCloudWeb project in Solution Explorer, and choose Publish to display the
Publish Web dialog, as shown in Figure 10-32.
3. Click Import.
5. Navigate to the top-level folder of the WineCloudWeb project where you saved the
publish profile in the previous procedure.
7. Click OK. This advances to the Connection page, which gets populated automatically with
information loaded from the proile, as shown in Figure 10-33.
10. Click Publish. Because of all the assemblies used by the project, it can take several minutes to
deploy the first time. However, subsequent deployments will typically take only a few seconds.
Everything is now running in the cloud—not just the website, but the DAL, and SQL Database as
well. All of that is running on hardware you never need to manage, worry about, touch, or see. Go
and enjoy the benefits of having a complete multitiered solution hosted on Microsoft Azure. You can
reach it from any browser, anywhere, anytime.
Once deployed, Visual Studio automatically launches the browser to the website. By default, it
displays the Index view of the Home controller, which should show the same data you entered while
testing the website locally. This is because the local website used the WineCloudDb SQL Database
deployed on Microsoft Azure, and that’s the same database that the deployed website is now using.
To run the application from any browser on any machine, just navigate to http://winecloudweb.
azurewebsites.net (of course, you need to adjust this URL for the name you supplied when you created
the website, which will be different if the name winecloudweb has already been taken). Take some
time now and use the website to enter a few more orders, as you did earlier when you were testing
the site locally.
Web API implements REST services, which are typically easier to create and more lightweight than
other types of services, such as SOAP-based services using WCF. (See the “So many choices” sidebar
at the beginning of this chapter.) To facilitate data access over HTTP, REST services are mapped to
standard HTTP request verbs such that a GET request retrieves data, a POST request inserts data, a
PUT request updates data, and a DELETE request deletes data. The URL of the request is parsed by the
Web API runtime to determine which controller action to invoke.
For example, a GET request with a URL that ends with /api/Wine responds by returning all the
wines in the database, while the same GET request with a URL that ends with /api/Wine/5 returns
only a single wine with the WineId value of 5. The other HTTP verbs package additional information
for the request in the HTTP request body. This is information that is either impractical or impossible
to encode in the URL. For example, a PUT request with a URL that ends with /api/Wine/3 means that
a single wine with the WineId value of 3 is to be updated, while the actual data for the updated wine
(the updated name, category, year, and price) is embedded in the HTTP request body as a simple
set of key-value pairs.
Data is returned by Web API services as a string in JavaScript Object Notation (JSON) format.
Recall that when you created the project, you also chose to include the core assemblies for Web API.
(See Figure 10-21.) One of those assemblies is Newtonsoft.Json, which is a popular library for serializing
objects into JSON strings and deserializing JSON strings back into object instances. For a GET
request, the Web API calls into this library to serialize objects into a JSON-formatted string to return
to the client. The client, in turn, can also call into this library to deserialize the JSON-formatted string
received from the GET request into a live object on the client.
Like MVC website applications, Web API services are defined inside an ASP.NET Web Application
project, and you can certainly create another ASP.NET Web Application project to be used only for
Web API services. For simplicity, however, you will add the Web API services to the same ASP.NET
Web Application project you just created for the MVC website. The project already includes the core
assemblies for Web API (as you can see in Figure 10-21), so it is all ready to host Web API services in
addition to the MVC website; all you need to do is add a Web API controller class to the project.
Visual Studio provides a scaffolding feature similar to the one you used earlier to generate the
MVC controller with actions and views. In the case of a new Web API controller, the scaffolding fully
supports CRUD operations against any entity in the EDM. As you will see, the default routing engine
rules maintain separation between MVC and Web API controllers based on the presence or absence
of /api in the URL. Requests without /api in the URL are directed to MVC controllers, and those with
/api in the URL are routed to Web API controllers.
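As a point of reference, the default route registrations generated by the project template look roughly like the following sketch (the registration code lives in the App_Start folder; exact names can vary between template versions):

// App_Start\RouteConfig.cs: MVC routes have no api/ prefix.
routes.MapRoute(
    name: "Default",
    url: "{controller}/{action}/{id}",
    defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });

// App_Start\WebApiConfig.cs: Web API routes are prefixed with api/.
config.Routes.MapHttpRoute(
    name: "DefaultApi",
    routeTemplate: "api/{controller}/{id}",
    defaults: new { id = RouteParameter.Optional });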
Note If you experience déjà vu as you create and test the Web API controller, it might be
because you recently completed Chapter 8. You created and tested the very same Wine
Web API controller in Chapter 8 while learning how to handle transient SQL Database
errors with Entity Framework.
To create the scaffolding for a new Wine Web API controller with CRUD actions for the Wine entity,
follow these steps:
1. Right-click the Controllers folder in the WineCloudWeb project in Solution Explorer, and
choose Add | New Scaffolded Item.
2. In the Add Scaffold dialog, select Web API 2 Controller With Actions, Using Entity Framework,
as shown in Figure 10-34.
FIGURE 10-34 The Add Scaffold dialog has several choices for creating a new Web API controller class.
3. Click Add.
b. For Model Class, choose Wine (WineCloudModel) from the drop-down list.
d. Click Add.
Now take a look at what has been created for you. Open the WineController.cs class, and examine
the code that was generated. Notice the following:
■ The class inherits from System.Web.Http.ApiController, which is what makes this a Web API
controller class.
■ Several public action methods have been created. Based on the URL of an incoming request,
and the type of the incoming request, one of these action methods will be called.
■ Each action method is preceded by a comment line that indicates the type of HTTP request
and URL syntax that the action method will handle:
■ The action method signatures (their names and parameters) are expressed in a slash-delimited
format on the URL, following the controller name.
■ As indicated in the comment line above each method, the Web API routing engine prepends
/api to the controller name in the URL. As we mentioned, this ensures isolation between HTTP
requests for an MVC controller (without /api in the URL) and HTTP requests for a Web API
controller (with /api present in the URL).
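To make that shape concrete, here is an abbreviated sketch of what the scaffolded class typically looks like. The context class name assumes the EDM's default naming, and the using directives plus the scaffolded validation and concurrency checks are omitted, so treat this as an outline rather than the exact generated code:

public class WineController : ApiController
{
    private WineCloudDbEntities db = new WineCloudDbEntities();   // assumed EDM context name

    // GET api/Wine
    public IQueryable<Wine> GetWines()
    {
        return db.Wines;
    }

    // GET api/Wine/5
    [ResponseType(typeof(Wine))]
    public IHttpActionResult GetWine(int id)
    {
        Wine wine = db.Wines.Find(id);
        if (wine == null)
        {
            return NotFound();
        }
        return Ok(wine);
    }

    // PUT api/Wine/5
    public IHttpActionResult PutWine(int id, Wine wine)
    {
        if (id != wine.WineId)
        {
            return BadRequest();
        }
        db.Entry(wine).State = EntityState.Modified;
        db.SaveChanges();
        return StatusCode(HttpStatusCode.NoContent);
    }

    // POST api/Wine
    [ResponseType(typeof(Wine))]
    public IHttpActionResult PostWine(Wine wine)
    {
        db.Wines.Add(wine);
        db.SaveChanges();
        return CreatedAtRoute("DefaultApi", new { id = wine.WineId }, wine);
    }

    // DELETE api/Wine/5
    [ResponseType(typeof(Wine))]
    public IHttpActionResult DeleteWine(int id)
    {
        Wine wine = db.Wines.Find(id);
        if (wine == null)
        {
            return NotFound();
        }
        db.Wines.Remove(wine);
        db.SaveChanges();
        return Ok(wine);
    }
}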
There is one small additional step you need to take before you can start using this new Web API
controller. When the JSON result is received from the service, the JSON serializer attempts to serialize
all related entities that it discovers, which can result in circular references that cause errors. To avoid
this problem, the JSON serializer must be told to ignore circular references.
If you followed along with the procedures in Chapter 8, you might be wondering why this extra
measure must be taken, because it wasn't necessary in that chapter when you created a Wine controller
just as you did now. The answer is the Order table, which is present in this chapter's WineCloudDb
database; its relationship with Wine gives the serializer related entities to navigate, and thus circular
references to ignore.
To instruct the JSON serializer that circular references should be ignored, follow these steps:
1. Expand the App_Start folder of the WineCloudWeb project in Solution Explorer to reveal the
WebApiConfig.cs file.
3. Add the following line of code to the bottom of the Register method:
config.Formatters.JsonFormatter.SerializerSettings.ReferenceLoopHandling =
Newtonsoft.Json.ReferenceLoopHandling.Ignore;
4. Click the FILE menu, and choose Save App_Start\WebApiConfig.cs (or press Ctrl+S).
1. Press F5 (or click the Internet Explorer play button in the toolbar) to launch the website in
Internet Explorer. As usual, this displays the Index view of the Home controller.
2. Append the URL in the browser’s address bar with api/Wine, and press Enter. This executes
the GetWines action method in the WineController class and responds with the list of wines
from the WineCloudDb database. Internet Explorer’s default behavior asks if you would like to
save or open the results from the Web API call, as shown in Figure 10-36.
FIGURE 10-36 The browser prompts to open or save the Wine.json file returned by the Wine API
3. Click Open to view the list of wines returned in the JSON results, as shown in Figure 10-37.
(If prompted for how to open this type of file, click More Options and choose Notepad.)
4. Back in the browser, append the URL in the browser’s address bar with api/Wine/2, and press
Enter. This executes the GetWine method on the WineController and responds with the record
for WineId 2.
5. Click Open to view the JSON response in Notepad. This time, the response includes just
the single requested wine.
At this point, you are running the Web API services locally, even though the local services are
interacting with the live Azure SQL Database in the cloud. In the next section, you will deploy
WineCloudWeb again so that the MVC website, Web API services, and the database are all running in
the cloud.
To deploy the updated project with the new Web API controller to Microsoft Azure, follow these
steps:
1. Right-click the WineCloudWeb project in Solution Explorer, and choose Publish to display
the Publish Web dialog. Because you already deployed this project earlier, the dialog opens
directly to the Preview page, ready to publish using the previous settings, as shown in
Figure 10-38.
2. Click Publish. Within just a few moments, the project is deployed, and Visual Studio opens
a new browser window to the site on Microsoft Azure.
3. Verify that the deployed Web API services work properly by testing them just as you did
locally. Simply tweak the URL in the browser’s address bar with different /api/Wine requests,
and ensure the correct JSON results are returned in response.
Everything is now in place to support the Windows Phone 8 app, which will call into the Web API
services you just created.
The Windows Phone SDK contains everything you need to build Windows Phone 8 apps, including
a number of Visual Studio project templates designed specifically for a Windows Phone device. The
programming model is essentially Silverlight, meaning that if you have any Silverlight (or Windows
Presentation Foundation [WPF]) experience, you already possess the essential skills needed to get
productive quickly.
With the release of Visual Studio 2013, Microsoft made an important change in the way it ships this
SDK. Previously, the SDK was not included with Visual Studio, and a separate download was always
required. This remains true with earlier Visual Studio versions, but now Visual Studio 2013 includes
the SDK. However, as shown in Figure 10-39, the option to install the SDK is deselected by default. So
unless you have overridden this default at setup time by selecting the Windows Phone 8.0 SDK check
box, you will need to re-run setup now and select that check box. This means you will need access
to the original Visual Studio 2013 distribution media. You will also need to close Visual Studio before
installing the SDK. When you re-run setup, you will be prompted with the choices Modify, Repair, and
Uninstall. Choose Modify, select the Windows Phone 8.0 SDK check box, click UPDATE, and then click
Yes when the User Account Control dialog appears. Be prepared to wait for a while; the SDK has a
lengthy installation process.
Important If you are running Visual Studio 2013, it will not be sufficient to find and
download the SDK from Microsoft’s website. Doing so will install the SDK, but only for
Visual Studio 2012. If you don’t have Visual Studio 2012, the SDK will include the Visual
Studio 2012 shell with project templates for Windows Phone, but you still won’t have
Windows Phone templates in Visual Studio 2013. The only way to add the SDK to Visual
Studio 2013 is to re-run setup and select the check box.
FIGURE 10-39 By default, the Visual Studio 2013 Setup dialog does not select the Windows Phone 8.0 SDK
1. If you closed Visual Studio 2013 to install the Windows Phone 8.0 SDK, restart it now and
reopen WineCloudSolution.
2. Right-click WineCloudSolution in Solution Explorer, and choose Add | New Project to display
the Add New Project dialog.
3. On the left of the New Project dialog, expand Installed, Visual C#, and choose Windows
Phone.
4. Choose the Windows Phone App template, which is typically selected by default.
5. Name the project WineCloudPhone, as shown in Figure 10-40.
6. Click OK.
Visual Studio creates the project, and you’re ready to start building the phone app.
Adding Json.NET
As we began explaining, this phone app will call the Web API services you recently created in
the WineCloudWeb ASP.NET Web Application project. Those services will return Wine entities on
the server as JSON-formatted strings to the client, so you will want to be able to access Json.NET
(the Newtonsoft.Json library) on the Windows Phone. This will let you easily deserialize the JSON
responses received from the service into Wine object instances on the phone.
1. Right-click the WineCloudPhone project in Solution Explorer, and choose Manage NuGet
Packages.
3. In the Search Online text box in the upper-right, type json.net (as shown in Figure 10-41)
and press Enter.
FIGURE 10-41 Downloading and adding a reference to Json.NET using the NuGet Package Manager
5. After Json.NET is installed, click Close to close the Manage NuGet Packages dialog.
With the Json.NET reference in place, it will be easy to deserialize JSON responses from the Web
API service in the phone app.
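For example, converting a JSON response into Wine objects takes a single Json.NET call. The line below is only an illustration of that call (the variable names are placeholders), but it mirrors what the code-behind described later in this section does:

// Deserialize the JSON array returned by GET api/Wine into an array of Wine objects.
Wine[] wines = JsonConvert.DeserializeObject<Wine[]>(jsonResponse);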
Windows Phone apps have two files per page: Extensible Application Markup Language (XAML)
and .NET (C# or Visual Basic) code-behind. As you will see, XAML has powerful binding capabilities.
XAML binding features greatly reduce the amount of code-behind you need to write. This is particularly
significant, because there are no scaffolding features in Visual Studio to build out the XAML as
there are to build out the HTML views in an MVC application. Listing 10-5 shows the Wine model class,
Listing 10-6 shows the XAML markup for the main page of the wine phone app, and Listing 10-7
shows the code-behind for the main page.
namespace WineCloudPhone
{
public class Wine
{
public int WineId { get; set; }
public string Name { get; set; }
public string Category { get; set; }
public int? Year { get; set; }
public decimal Price { get; set; }
}
}
<phone:PhoneApplicationPage
x:Class="WineCloudPhone.MainPage"
xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
xmlns:phone="clr-namespace:Microsoft.Phone.Controls;assembly=Microsoft.Phone"
xmlns:shell="clr-namespace:Microsoft.Phone.Shell;assembly=Microsoft.Phone"
xmlns:d="http://schemas.microsoft.com/expression/blend/2008"
xmlns:mc="http://schemas.openxmlformats.org/markup-compatibility/2006"
mc:Ignorable="d"
FontFamily="{StaticResource PhoneFontFamilyNormal}"
FontSize="{StaticResource PhoneFontSizeNormal}"
Foreground="{StaticResource PhoneForegroundBrush}"
SupportedOrientations="Portrait" Orientation="Portrait"
shell:SystemTray.IsVisible="True"
Loaded="PhoneApplicationPage_Loaded">
<Grid x:Name="LayoutRoot" Background="Transparent">
<Grid.RowDefinitions>
<RowDefinition Height="Auto"/>
<RowDefinition Height="*"/>
</Grid.RowDefinitions>
using Microsoft.Phone.Controls;
using Newtonsoft.Json;
using System;
using System.Net;
namespace WineCloudPhone
{
public partial class MainPage : PhoneApplicationPage
{
private const string WebApiUrl = @"http://winecloudweb.azurewebsites.net/api/Wine";
public MainPage()
{
InitializeComponent();
}
// GET api/Wine
wc.DownloadStringAsync(new Uri(WebApiUrl));
}
WineListPanel.Visibility = Visibility.Visible;
WineDetailPanel.Visibility = Visibility.Collapsed;
LayoutRoot.DataContext = wines;
}
_wine = (Wine)e.AddedItems[0];
DeleteButton.Visibility = Visibility.Visible;
WineListPanel.Visibility = Visibility.Collapsed;
WineDetailPanel.Visibility = Visibility.Visible;
LayoutRoot.DataContext = _wine;
}
if (_wine.WineId == 0)
{
// POST api/Wine
wc.UploadStringAsync(new Uri(WebApiUrl), "POST", wineInfo);
}
else
{
// PUT api/Wine/5
var webApiUrl = string.Format("{0}/{1}", WebApiUrl, _wine.WineId);
wineInfo = string.Format("WineId={0}&{1}", _wine.WineId, wineInfo);
wc.UploadStringAsync(new Uri(webApiUrl), "PUT", wineInfo);
}
}
if (confirm != MessageBoxResult.OK)
{
return;
}
// DELETE api/Wine/5
var wc = new WebClient();
wc.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";
wc.UploadStringCompleted += WinesUpdated;
LoadWines();
}
}
}
We’ll explain all of this code in just a moment, right after you implement it in the project. To do so,
follow these steps:
1. Right-click the WineCloudPhone project in Solution Explorer, and choose Add | Class.
3. Replace the template code with the code shown earlier in Listing 10-5 (or paste it in from the
listing file downloaded from the book's companion website).
5. In the XAML code editor, replace the template code generated automatically by Visual Studio
with the code shown earlier in Listing 10-6 (or paste it in from the listing file downloaded from
the book's companion website).
7. Replace the template code with the code shown earlier in Listing 10-7 (or paste it in from the
listing file downloaded from the book's companion website).
8. Edit the URL in the WebApiUrl string constant at the top of Listing 10-7 with the name you
used for the Microsoft Azure website, assuming the name winecloudweb was not available to
you.
9. Click the FILE menu, and choose Save All, or press Ctrl+Shift+S.
First we'll explain the Wine model class (Listing 10-5). This class reflects the Wine entity in the EDM
in the DAL. Web API services are very lightweight, and consequently, they do not advertise their schemas
with metadata as WCF services can. Thus, there is no way to automatically generate client-side
proxy classes that reflect the service-side entities, which means you need to manually re-create the
desired entity classes (the Wine class, in this case) to represent a model in the phone app.
Now shift focus to the main page itself. The key points to note about the XAML markup in Listing
10-6 are the two StackPanel objects, named WineListPanel and WineDetailPanel. As mentioned, only
one of these panels at a time will be made visible by the code-behind. In the markup, WineListPanel
has a ListBox that displays all the wines, and embedded within the ListBox is a DataTemplate that
defines how each wine in the list should be rendered. The template has three TextBlock elements,
which render three parts of each wine. The Text property in each TextBlock is set to a Binding with
a Path that points to a specific property of the Wine class: Name, Year, and Category. Also note
that the Style of the first TextBlock is set to PhoneTextExtraLargeStyle, while the other two are set to
PhoneTextSubtleStyle. This means that the wine name will appear nice and large in the list, with the
year and category beneath in a smaller font size.
The list box also has a SelectionChanged event handler that fires in the C# code-behind whenever
the user selects a wine from the list (that is, when the user taps one). When the code-behind
responds, WineListPanel gets hidden and WineDetailPanel gets made visible so that the user can edit
the selected wine. There is also an Add button beneath the list box that the user can tap to add a
new wine. This also toggles the panels, but it binds the UI to a new Wine object, rather than an existing
Wine object from the list.
The second panel, WineDetailPanel, contains four TextBox controls for editing the information of
a new or existing wine. Again, the controls have a binding path to a specific Wine property, but this
time, Mode=TwoWay is also specified. This small but critical detail is all that's needed to implement
bi-directional binding—properties of the bound object appear in the text boxes, and changes made
by the user in those text boxes are pushed back into the bound object. Beneath the text boxes is a
Save button to push the changes through the Web API service, through the DAL, and back to the SQL
Database. There is also a Cancel button, which returns to the wine list panel, discarding any changes
made on the details panel. Finally, there is a Delete button, which is hidden when entering a new
wine.
Now turn your attention to the code-behind shown in Listing 10-7. First, notice the using statement
for Newtonsoft.Json. This imports the namespace for the Json.NET library so that it can be easily called
from the code-behind.
Right beneath the constant, you see the _wine variable defined as a private (page-level) variable of
type Wine. This variable will hold an instance of the new or existing Wine object being created or updated
in WineDetailPanel. The page has a DataContext property that represents the object currently
bound to the XAML. When the list panel is being displayed, the DataContext is set to the array of
wines returned by the Web API service (which displays all the wines in the list box), and when the
details panel is displayed, the DataContext is set to the _wine variable (which bi-directionally data-binds
one specific wine to the text boxes).
In the PhoneApplicationPage_Loaded event, you can see that LoadWines is called. The LoadWines
method prepares a WebClient object to issue an HTTP request, and it sets the HTTP Accept header
to application/json. This lets the server know that the client is able to receive a response in JSON
format. Next, the code registers a handler for the DownloadStringCompleted event. This specifies
WinesLoaded as the callback function to the asynchronous service call. What this means is that your
code doesn’t wait for a response after it calls the service. That would be a synchronous service call,
which is never permitted in Silverlight, because synchronous calls block the UI while waiting for the
service response. To keep the device responsive while interacting with services, only asynchronous
service calls are allowed. So rather than waiting for a response, the WinesLoaded method is called
automatically when the DownloadStringCompleted event fires, which means that the service is ready
to return its response.
At this point, wines is populated with the complete list of wine objects, deserialized from the JSON
response returned by the service. The visibility properties are now set to show the list panel and hide
the detail panel, and the LayoutRoot.DataContext property is set to wines, which binds the array to
the list box.
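Only fragments of LoadWines and WinesLoaded appear in the portion of Listing 10-7 reproduced earlier. Pieced together from the description in the preceding paragraphs, the two methods look roughly like the following sketch (the listing file from the book's companion website has the authoritative version):

private void LoadWines()
{
    // Ask the service for JSON and register the asynchronous callback.
    var wc = new WebClient();
    wc.Headers[HttpRequestHeader.Accept] = "application/json";
    wc.DownloadStringCompleted += WinesLoaded;

    // GET api/Wine
    wc.DownloadStringAsync(new Uri(WebApiUrl));
}

private void WinesLoaded(object sender, DownloadStringCompletedEventArgs e)
{
    if (e.Error != null)
    {
        MessageBox.Show(e.Error.Message);
        return;
    }

    // Deserialize the JSON response into Wine objects and bind them to the list box.
    var wines = JsonConvert.DeserializeObject<Wine[]>(e.Result);
    WineListPanel.Visibility = Visibility.Visible;
    WineDetailPanel.Visibility = Visibility.Collapsed;
    LayoutRoot.DataContext = wines;
}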
When the user clicks the New button, the NewWineButton_Click event handler fires. The event
handler code simply assigns _wine to a new Wine object instance (setting a few default property
values), toggles the panels, and binds the new object to the page.
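The body of NewWineButton_Click is not reproduced above; based on that description, it presumably looks something like the following sketch (the default values chosen here are illustrative guesses):

private void NewWineButton_Click(object sender, RoutedEventArgs e)
{
    // Bind the details panel to a brand-new wine; WineId stays 0 until the service assigns one.
    _wine = new Wine { Year = DateTime.Now.Year };

    DeleteButton.Visibility = Visibility.Collapsed;   // nothing to delete yet
    WineListPanel.Visibility = Visibility.Collapsed;
    WineDetailPanel.Visibility = Visibility.Visible;
    LayoutRoot.DataContext = _wine;
}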
Next, the WineListBox_SelectionChanged event handler fires when the user taps on a wine in the
list. This event actually fires several times, as selected items are either added or removed. The code is
interested only in the event that fires when there is exactly one object in the AddedItems property of
the SelectionChangedEventArgs variable passed in as e. It then extracts the tapped wine object into
_wine. Finally, it toggles the panels and sets LayoutRoot.DataContext to _wine, just as it does for a new
wine (though this time the Delete button is made visible).
When the user clicks the Save button, the SaveButton_Click event handler fires and it's time to
push the changes back to the Web API service, which either creates or updates a wine in the database.
The first part of the save logic works the same in either case. The code creates a new WebClient object
and sets the HTTP ContentType header to application/x-www-form-urlencoded, which lets the service
know that the client is sending URL-encoded strings in the request body. Then it registers a handler
for the UploadStringCompleted event. This specifies WinesUpdated as the callback function to the
asynchronous service call. After registering on the UploadStringCompleted event, a URL-encoded
key-value-pair string containing the wine's Name, Category, Year, and Price properties is built and stored
in wineInfo. At this point, the save logic is handled differently for creating a new wine or updating an
existing wine.
If the WineId property is zero, this is a new wine. In this case, the UploadStringAsync method is
called. This method takes a URL as a parameter, which is the URL to the Wine Web API services. It
also accepts an HTTP method parameter, which is set to POST. Thus, the code issues an asynchronous
POST request over HTTP, which inserts the new wine. If WineId is not zero, this is an existing wine. In
this case, UploadStringAsync is still called, but with a few differences. First, the URL for the Web API
call is appended with /id (where id is the value of WineId). Second, the WineId property is prepended
in the URL-encoded key-value-pair string because it is part of an existing wine entity. And third, the
HTTP method parameter is set to PUT. Thus, the code issues an asynchronous PUT request over HTTP,
which updates the existing wine.
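To make the payload concrete, wineInfo ends up as a form-encoded string such as Name=Merlot&Category=Red&Year=2013&Price=19.99. A sketch of how it might be built is shown below; the actual listing may construct it differently, so treat the exact formatting call as an assumption:

// Hypothetical construction of the URL-encoded payload for POST and PUT requests.
var wineInfo = string.Format(
    "Name={0}&Category={1}&Year={2}&Price={3}",
    Uri.EscapeDataString(_wine.Name ?? string.Empty),
    Uri.EscapeDataString(_wine.Category ?? string.Empty),
    _wine.Year,
    _wine.Price);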
When the user clicks the Delete button, the DeleteButton_Click event handler fires, and the user is
first prompted to confirm before deleting the wine. Then the code creates a new WebClient object,
sets the HTTP ContentType header to application/x-www-form-urlencoded, registers a handler for the
UploadStringCompleted event, and appends /id to the URL for the Web API call—just the same as
when updating an existing wine. This time, however, only the wine ID is passed in the request body;
there is no need to pass all the properties of an entity that is about to be deleted. The code then
issues the asynchronous DELETE request over HTTP, which deletes the existing wine.
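The fragment of Listing 10-7 shown earlier stops just short of that call; it presumably resembles the following sketch (an assumption consistent with the description above, not the verbatim listing):

// DELETE api/Wine/5: issue the asynchronous delete, passing only the wine ID.
var webApiUrl = string.Format("{0}/{1}", WebApiUrl, _wine.WineId);
wc.UploadStringAsync(new Uri(webApiUrl), "DELETE", string.Format("WineId={0}", _wine.WineId));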
The WinesUpdated method is the callback function for all three asynchronous service calls:
POST, PUT, and DELETE. As with WinesLoaded, this method first examines the Error property of the
UploadStringCompletedEventArgs parameter passed into the callback to ensure that the service call
succeeded without an error. If an error occurred, a message is displayed. Otherwise, LoadWines is
called. This queries the service again to retrieve an updated wine list, and then toggles the display
panels to view the list.
1. Right-click the WineCloudPhone project in Solution Explorer, and choose Set As Startup
Project.
2. Press F5 to build and run the app in the phone emulator (which can take a long time to load).
The wine list should appear as shown in Figure 10-42.
Tip When you want to stop the app during a debugging session, don’t close the
emulator. Instead, just stop your code execution inside Visual Studio and keep
the emulator running so that it can host your app the next time you press F5.
FIGURE 10-42 The wine list displayed in the Windows Phone emulator
3. Click on the third wine. The details page for Mendoza is displayed for editing.
Spend some time now to experiment a bit more with the app. Go ahead and create some new
wines, update a few of them, and delete one or two. Then congratulate yourself—you’ve built a
complete layered cloud solution on Microsoft Azure SQL Database, end-to-end, cloud-to-phone!
Summary
This chapter covered a lot of ground. There are myriad ways to create cloud solutions, and many
different tools and technologies are available for you to create them with. To be successful with any
of them, you need to design a properly layered stack of components to deliver a reliable, maintainable,
and scalable solution. In this chapter, you learned how to do just that using Microsoft Azure SQL
Database and readily available Microsoft .NET technologies.
When creating your own solutions, you have many more alternatives than those presented in this
chapter. For example, you don’t need to use Entity Framework; you can instead create your own data
access layer using conventional ADO.NET. Instead of creating the website using ASP.NET MVC, you
can choose to create it using ASP.NET web forms. And rather than using Web API, you can build the
service layer using raw WCF, WCF Data Services, or WCF RIA Services. Regardless of which particular
technologies you choose, however, applying the principles presented in this chapter will guide you in
implementing a proper layered design across the various tiers of your cloud solution.
About the authors
About Tallan
Tallan (http://www.tallan.com) is a national technology consulting irm that
provides web development, business intelligence, customer relationship manage-
ment, custom development, and integration services to customers in the inancial
services, healthcare, government, retail, education, and manufacturing industries.
Lenni is also chief technology oficer (CTO) and cofounder of Sleek Technologies,
Inc., a New York-based development shop with an early-adopter philosophy toward
new technologies. He is a sought-after and highly rated speaker at industry conferences
such as Visual Studio Live!, SQL PASS, SQL Bits, and local technology user group
meetings. He is also lead author of Programming Microsoft SQL Server 2012
(Microsoft Press, 2012). Lenni can be reached at lenni.lobel@tallan.com or
lenni.lobel@sleektech.com.
Eric D. Boyd is the Founder and CEO of responsiveX
(www.responsiveX.com), a Microsoft Azure MVP, and a regular
speaker at national conferences, regional code camps and
local user groups. He is so passionate about apps and cloud
services that he founded responsiveX, a management and
technology consultancy that helps customers create great
web, mobile and client experiences, and these apps are often
powered by cloud services. Eric launched his technology
career almost two decades ago with a web-development startup and has served in
multiple roles since, including developer, consultant, technology executive and business
owner. You can find Eric blogging at http://www.EricDBoyd.com and on Twitter at
http://twitter.com/EricDBoyd.