Microsoft Windows Server 2012 R2 - Administration: File Services and Encryption

The document discusses implementing and managing the Distributed File System (DFS) in Windows Server 2012 R2. It covers topics like DFS namespaces, standalone vs domain-based namespaces, DFS candidates for scenarios like server consolidation and branch offices, and requirements for DFS like being built into Windows and needing servers in the same Active Directory forest.

Course Transcript

Microsoft Windows Server 2012 R2 - Administration: File Services and Encryption
Distributed File Systems and FSRM
1. The Distributed File System in Windows Server 2012 R2

2. Implementing DFS in Windows Server 2012 R2

3. Managing DFS in Windows Server 2012 R2

4. The File Server Resource Manager in Server 2012 R2

5. Configuring FSRM in Windows Server 2012 R2

Securing File Systems


1. Using BitLocker Encryption in Windows Server 2012 R2

2. Configuring EFS in Windows Server 2012 R2

3. Using Audit Policies in Windows Server 2012 R2

4. Implementing DFS, FSRM, Encryption, and Auditing


The Distributed File System in Windows Server 2012 R2
Learning Objective
After completing this topic, you should be able to
◾ differentiate between standalone and domain-based namespaces

1. Meet your instructor


Microsoft Windows Server 2012 R2 - Administration: File Services and Encryption

[Welcome to Microsoft Windows Server 2012 R2 - Administration: File Services and Encryption.]

One of the challenges that we all face is the problem of file server sprawl, right. Every
department wants its own file server. Everybody has got a file server. And I have got people
creating these files out there on the network, and many times it is an unmanaged kind of
process. So what do we do about that? Well today, in Windows Server 2012 and 2012 R2, we
have the File Server Resource Manager role and the File Server Resource Manager role is
going to let me create file quotas. It is going to let me create a file screen so that only the
appropriate kinds of documents get saved to my file servers.

We want to take a look at the File Server Resource Manager role for managing our file servers and we want to talk about encryption. If I have sensitive data, how do I encrypt the entire contents of a drive or a portable drive with BitLocker? Or how do I educate my users to do file-level encryption with the Encrypting File System, or EFS? We will take a look at this in this course.

[The goal of this course is to work with File Services and encryption in Windows Server 2012
R2.]

2. Distributed file system overview


One of the problems that we all face is file server sprawl. Every department wants a file server,
you are mapping network drives for everybody, users are all over the place, right? And what I
want to do is I want to simplify that. I want to grant users access to a variety of different
servers, but do so through a single namespace. So that when a user hits the Marketing folder,
for example, they may be hitting the Marketing file server. Then when they hit the Sales folder
they are hitting the Sales file server. They are different servers but from the perspective of the
user the folders appear in a single tree within a single namespace. Just as if they were
browsing Windows Explorer on their local machine. They can browse the Distributed File
System, or DFS, namespace across the network. And of course, one of the great benefits of
DFS is the replication piece, so we can make these files available to users not just on a single
server but we can replicate those files. And we can set preferences, so that users prefer to
connect to a DFS server in their own physical site...in their own location reducing overall WAN
bandwidth consumption.
[Users from Montreal and Scottsdale are accessing and receiving data from the DFS
Namespace server.]

When we install DFS, there are two role services that we can install: DFS Namespaces and DFS Replication. The DFS Replication component is required on all DFS servers that will participate in replication of DFS content or the namespace hierarchy. If I am going to replicate it to other boxes, I have got to have the replication component installed. If the other boxes are going to accept the replication, they have got to have the replication component installed too.

The DFS Namespace is the core component that lets me build that DFS hierarchy, right,
create those folders that point out to those folder targets on a variety of file servers and then
replicate that hierarchy to other machines, whether or not they have the content on them. So
the DFS Namespace, the hosting piece for the file...folder hierarchy and then DFS Replication
to replicate that out and the content to other servers.

[Users from Montreal and Scottsdale are accessing and receiving data from the DFS
Namespace server and the servers at Montreal and Scottsdale are replicating data.]
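Before we get to the demos, here is a quick, hedged way to check whether those two role services and the management console are present on a server; a minimal sketch assuming the Server Manager PowerShell module that ships with Windows Server 2012 R2:

# List the DFS-related role services and tools along with their install state
Get-WindowsFeature FS-DFS-Namespace, FS-DFS-Replication, RSAT-DFS-Mgmt-Con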

A Distributed File System hierarchy is composed of folders and folder targets. Now guys think about this with me, the whole point of DFS is to create redundancy in the files and failover to another server should access to the primary server be unavailable, right, for the user. And to obfuscate the actual server infrastructure, I don't want them to know that they were attaching to a different server. So what do I do? I create a namespace, right? We see that namespace up there; that namespace root. And then I want to give structure to the namespace. So I create a folder called Software, that is just a folder, right? It is a container object, there is nothing in it. It doesn't point to anything. Then beneath that I create new folders with folder targets. That Tools directory, for example, points out to other DFS shares that contain the actual content.

[The Documents and Process Guides folders are in the \\Server1\Public root of the
Namespace server. The documents folder contains another folder WordDocs. The folder
targets for WordDocs are \\Branch1\WordDocs Montreal and \\Branch2\WordDocs Scottsdale.
The folder target for Process Guides is \\Branch3\Processes NY.]

3. DFS namespaces
When you create a DFS namespace, you are given two choices. I can either create a
standalone namespace or a domain-based namespace. Now guys when we do this, we always
create domain-based namespaces, why? Because the whole point of DFS is to be able to
replicate these files and folders to other locations, right? I want fault tolerance, I want
redundancy, I want a replication infrastructure. And with standalone DFS I don't get any of that.
So in the test environment, in the test lab, or on a clustered machine, right, where I had redundancy at the server level, I could certainly deploy a standalone namespace for testing purposes and that kind of thing.

Now when I do that, I am very limited in terms of what I can do. The root of the namespace, for example, is always the server name. With domain-based DFS, I get all kinds of options and I can create multiple namespaces. It could be the domain name as the root of the namespace, it could be the NetBIOS domain name, the DNS domain name, right? I have all these options, right? Additionally, I have the Active Directory, or AD, integration, while with standalone DFS namespaces everything is stored in the registry on the local machine. The other thing to mention about this, and you see it on the slide there, right? Remember, SYSVOL replication starting in Server 2008 uses DFS Replication as its replication mechanism, which is preferred over the traditional File Replication Service from earlier versions of the operating system.
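To make the difference concrete, here is a hedged sketch using the DFSN PowerShell cmdlets that come with the DFS management tools; the server and domain names are just the ones used in this course's lab:

# Standalone namespace: the root is always anchored to the server name
New-DfsnRoot -Type Standalone -Path "\\Fileserver1\Public" -TargetPath "\\Fileserver1\Public"

# Domain-based namespace (Windows Server 2008 mode): the root is anchored to the domain name
New-DfsnRoot -Type DomainV2 -Path "\\corp.brocadero.com\Public" -TargetPath "\\Fileserver1\Public"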

4. DFS candidates
When we think about DFS, you know, here are a couple of scenarios that we can use this thing
for, right? I can use it in server consolidation. You know as well as I do, one of the problems
we have had over the last 15 years in this business is server sprawl. Every department wants
its own server, right? They have got...they got their own things they want to do and we have
had lots of servers crop up over time. So today, you know, because hardware is so cheap and
we can make it so powerful, we may well do server consolidation. So what I can do is I can set up DFS, replicate all that content to a central location, and then decommission the old servers; that is one methodology.

Branch office data – you have got users out in the branch office, they are creating files; files need to get uploaded to headquarters. Well what do we do? We create DFS shares, the files are saved to the shares, and they get replicated automatically back to HQ. And every morning when I come in, there they are from yesterday's work. Disaster recovery, or DR, scenarios...I just want to know that my file server data has been replicated to the offsite DR location. So that in the event of some catastrophic loss of the physical building, everything is out there in the data warehouse. Those are all potential solutions that we can employ DFS for.

[Share1 from Server01, share2 from Server02, and share3 from Server03 are consolidated to the DFS Namespace server in the first scenario, the data from branch offices are continuously updated to the servers at headquarters in the second scenario, and the data from the Primary site, or HQ, are backed up at the DR site to tackle disaster.]

5. DFS requirements
When we think about the requirements for DFS, you know, one of the great things is that it is built into the operating system.

Graphic

Users from Montreal and Scottsdale are accessing and receiving data from the DFS
Namespace server.

So there is no additional licensing cost, I don't need any special hardware, I don't need any
extra software, right? It is just built right in.

And when we think about the Active Directory requirements, all of the servers in the replication group need to be in the same Active Directory forest and they need to have the DFS Replication service installed.

Job Aid

Use the Distributed File System in Windows Server 2012 R2 job aid to learn about the Windows operating systems that support accessing DFS namespaces.

Additionally when we think about the antivirus, we want to get a DFS-aware antivirus.
Something that is going to be compatible.

And only the NTFS file system is supported, so no other file systems, and there is also no support currently for Cluster Shared Volumes.
Implementing DFS in Windows Server 2012 R2
Learning Objectives
After completing this topic, you should be able to
◾ recognize how to configure a DFS namespace in a given scenario
◾ identify the steps to create a DFS Replication group

1. Demo: Installing DFS


Now folks you can see here we are in the Add Roles and Features Wizard. We are going to
drill-down under File and Storage Services, File and iSCSI Services and that is where we
see DFS Namespaces. We want this machine to host the namespace and we are prompted
for additional features to install, right, the Administration Tools – I want the Administration Tools. And I also want DFS Replication; I need this to be installed on every machine that is going to participate in DFS Replication, so go ahead and say Next.

We don't need any additional features at this time, we will go ahead and say Install. And now,
of course, guys two things to point out here, right? We know we are on a Server 2012 R2
machine because if you look in the lower left-hand corner, there is that Start button, right? This
doesn't take us to a Start menu but it takes us back to the Start screen and then we see that
there, and then that will toggle us back.

[Server Manager is open and the Add Roles and Features Wizard is displayed. In this wizard,
the navigation pane includes the following tabs: Before You Begin, Installation Type, Server
Selection, Server Roles, Features, Confirmation, and Results. From these tabs, Server Roles
is already selected and the Select server roles page is displayed. The Select server roles page
includes the Roles list box, which includes several checkboxes. From these checkboxes, the
File and Storage Services (4 of 12 installed) checkbox is expanded and two checkboxes, File
and iSCSI Services (3 of 11 installed) and Storage Services (Installed), are displayed. From
these checkboxes, Storage Services (Installed) checkbox is already selected. The instructor
expands the File and iSCSI Services (4 of 12 installed) checkbox and several checkboxes are
displayed, from which File Server (Installed) and File Server Resource Manager (Installed)
checkboxes are already selected. The instructor selects the DFS Namespaces checkbox and
the "Add features that are required for DFS Namespaces?" dialog box is displayed. This dialog
box displays tools that are required to manage the feature. These tools are Remote Server
Administration Tools, Role Administration Tools, File Services Tools, and DFS Management
Tools. The dialog box also includes the Include management tools (if applicable) checkbox,
which is already selected. The instructor clicks Add Features, the dialog box closes, and the
Select server roles page of the Add Roles and Features Wizard is displayed. Then from the
Roles list box, the instructor selects the DFS Replication checkbox and clicks Next. As a result,
the Features page of the wizard is displayed. The instructor again clicks Next and the
Confirmation tabbed page is displayed. Next the instructor clicks Install and the Results page
is displayed, which displays the installation progress. Then the instructor clicks the Start button
and the Start screen is displayed. The instructor navigates to the Server Manager in which the
installation progress of the features is displayed.]
And of course, today not only do we install roles and features here in the wizard, in the GUI,
but we can use the underlying PowerShell cmdlets as well. And so if we look in here, here is
the cmdlet that I want to know for DFS if I want to script these installs. Install-WindowsFeature is the cmdlet that we will call; FS is for File Services, right, or file server.

DFS is the component or the role service of that file server role. And then the namespace is
the DFS component that we want to install. And of course, we want the replication component,
so we specify FS-DFS-replication as well. Then we need the Management console and
so we specify RSAT-DFS-Mgmt-Con for console, right? And that is going to give us
everything we need on this machine both to host the namespaces and replicate them off this
machine.

[The Add Roles and Features Wizard is open and in the Results page, the installation progress
of the features is displayed. The instructor navigates to Windows PowerShell, in which, at the
C:\Users\MJMurphy> prompt, the following command is entered: Install-WindowsFeature FS-
DFS-Namespace,FS-DFS-replication,RSAT-DFS-Mgmt-con]
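For reference, here is a hedged sketch of that scripted install, with the feature names as they appear in Windows Server 2012 R2; run it from an elevated PowerShell session:

# Script the install instead of clicking through the wizard
Install-WindowsFeature FS-DFS-Namespace, FS-DFS-Replication, RSAT-DFS-Mgmt-Con
# Alternatively, -IncludeManagementTools pulls in the RSAT console automatically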

2. Demo: Configuring DFS namespaces


Now that we have the DFS role installed, we can come on up here to Tools, down here to DFS
Management and this is the first time the DFS Management console is being launched. And
so we will see in here that nothing is configured. All right, I have got nothing to show in
here, nothing to show in here. Come up here to DFS Management and you can see we are
being prompted to add namespaces, add replication groups. All right, we are being prompted
to configure this thing, so we can come up here to the Namespaces node, hit New
Namespace and that will launch the Namespace Wizard.

First thing we got to do is we got to say what server is this namespace to be hosted on. And in
our example here that is this local machine Fileserver1 because I am real original. And
we are going to give this namespace a name, right? We will call it Public. Over here, we
could specify the settings, so we could assign share permissions right in here, right? And that
is what this is – this is the share permissions, piece of this. And guys you know, right, by
default Microsoft recommends a default share permission of everybody Full-control and then
we lock the directory down with NTFS file permissions.

[Server Manager is open and the Dashboard tabbed page is displayed. The instructor opens
the Tools menu and selects the DFS Management command. As a result, the DFS
Management Console is displayed. The navigation pane of this console includes the node DFS
Management, which includes the nodes Namespaces and Replication. The instructor selects
the Namespaces node, which is empty and then selects Replication - which is also empty.
Then the instructor selects the node DFS Management and the Getting Started page is
displayed in the view pane. In the Getting Started page, the DFS Management Tasks section
includes a note on Publish Data to Multiple Servers, Collects Data for Backup Purposes, and
Manage Namespaces and Replication Groups. The Manage Namespaces and Replication
Groups section includes two hyperlinks, Add namespaces to display and Add replication groups to display. In the navigation pane, the instructor right-clicks on the Namespaces node and from
the shortcut menu, clicks New Namespace command and the New Namespace Wizard is
displayed. In this wizard, the Namespace Server page is displayed, which includes the Server
text box and the Browse button. In the Server text box, the instructor enters the name
Fileserver1 and clicks Next. As a result, the Namespace Name and Settings page is displayed,
which includes the Name text box and the Edit Settings button that is disabled. In the Name
text box, the instructor enters the name Public and the Edit Settings button becomes enabled.
The instructor clicks Edit Settings and the Edit Settings dialog box is displayed. This dialog box
includes the Namespace server field, which is disabled and bears the name Fileserver1. The
dialog box also includes the Shared folder field, which is disabled and bears the name Public.
The Edit Settings dialog box contains the Local path of shared folder field, in which
C:\DFSRoots\Public is already entered. The dialog box also contains the Shared folder
permissions, which are as follows: All users have read-only permissions All users have read
and write permissions Administrators have full access; other users have read-only permissions
Administrators have full access; other users have read and write permissions Use custom
permissions From these permissions, All users have read-only permissions option is already
selected. The use custom permissions option has a Customize button, which is disabled.]

And if the purpose of this...everything in here was Read-only, if this was all standard operating
procedure forms, documentations, we could add a layer of security here by saying that All
users have read-only permissions. That is the default you can see that here or we could
come down here if we want to limit the scope of this, and we could specify what groups and
what level of share access they have. For our purposes, we are going to accept the defaults at
this point and go to next.

And here is the big thing, right? Is this a Stand-alone namespace, or a Domain-based
namespace? In our test environments, if you have got a, you know, limited scale virtual
environment that you are using to get ready to test, stand-alone namespaces are an
acceptable choice. You can see we have got a limit on the name of the namespace. The namespace has always got the server name as its root and that is the only choice we have for that. And additionally, there is no replication with the stand-alone namespace, right?
And the whole point of this is commonly for replication. So we are going to choose Domain-
based namespace. Now there are a couple of things here, right? The root is going to be the
DNS domain name, but it could be the NetBIOS domain name, it could be the root domain
name, right? So we have got more choices there in terms of how we name this space.

[The New Namespace Wizard is open and the Edit Settings dialog box is displayed. In this
dialog box, under the Shared folder permissions section, the option All users have read-only
permissions is already selected. The instructor clicks OK, the dialog box closes, and the
Namespace Name and Settings page of the wizard is displayed. Then the instructor clicks
Next and the Namespace Type page of the wizard is displayed. This page allows you to select
the type of namespace to create and consists of the round button options Domain-based
namespace and Stand-alone namespace, from which Domain-based namespace is already
selected. The Domain-based namespace option includes the Enable Windows Server 2008
mode checkbox, which is already selected and the field Preview of domain-based namespace,
which is disabled and bears the path \\corp.brocadero.com\Public. The Stand-alone
namespace option includes the field Preview of stand-alone namespace, which is disabled and
bears the path \\Fileserver1\Public.]

And then, for you test takers out there, you want to pay particular attention to this little button right here – Enable Windows Server 2008 mode. Now, there are two things to know about this guys. The first thing is, what do I get if I enable this Windows Server 2008 mode? What I get is access-based enumeration. Access-based enumeration creates an environment in which users only see the things that they have permissions to. Now anybody out there that grew up in the Novell world is just thinking to themselves, well jeez Murph, that is the way things always were right out of the box in Novell, right? That is the way they ought to be – if you don't have permissions to it, you should not be able to see it.

Today in Windows, we can do that by enabling access-based enumeration. And we get that
feature set here by enabling Windows Server 2008 mode. Now guys you might come in here
and you might see that this is grayed out. If it is grayed out, the reason it is grayed out is
because you don't meet the domain functional level requirement. You got to be running your
domain at 2008 mode minimally for this thing to work. The forest functional level can still be
2003, but the domain has to be 2008. We meet that requirement, so I will enable 2008 mode.
We see a review of the settings there. We go ahead and we create the namespace; we see
the namespace was created successfully. We close that up.

[The New Namespace Wizard is open and the Namespace Type page is displayed. In this
page, the round button option Domain-based namespace is already selected. This option
includes the Enable Windows Server 2008 mode checkbox, which is already selected. The
instructor clicks Next and the Review Settings and Create Namespace page of the wizard is
displayed. The instructor clicks Next and the Confirmation page of the wizard is displayed,
which displays the message "You have successfully completed the New Namespace Wizard."
The instructor clicks Close, the wizard closes, and the DFS Management Console is
displayed.]
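If the namespace already exists, access-based enumeration can also be toggled from PowerShell. A minimal sketch, assuming the DFSN cmdlets are available and using this lab's namespace path:

# Requires a domain-based namespace created in Windows Server 2008 mode (DomainV2)
Set-DfsnRoot -Path "\\corp.brocadero.com\Public" -EnableAccessBasedEnumeration $true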

Now we come over here and we see there is the namespace. Now again this is just the
namespace, right? We don't have any folders associated with this namespace yet. So we
come over here to New Folder and we can add a new folder. And we will call this folder
Standards. And the path to the folder target is here in our local machine and we see that
there is this corporate standards and operating procedures. That is going to be the target,
when people hit that Standards folder they actually go to the Corp SOP directory, right? Those
are the documents that they are going to hit. When we have added that folder and folder
target, we can come in here into the properties and you can see in here there is this Referrals
tab, Exclude targets outside of the client's site, Clients fail back to preferred targets.
What does this mean? Well one of the things that we want to do is we want to focus client
traffic to a DFS server that is local to them, right?

[The DFS Management Console is open and the Getting Started page is displayed. In the
navigation pane, the instructor expands the Namespaces node and the
\\corp.brocadero.com\Public node is displayed. The instructor selects the
\\corp.brocadero.com\Public node and its tabbed page is displayed in the view pane. Then
from the Actions pane, the instructor clicks New Folder and the New Folder dialog box is
displayed. In this dialog box, the instructor enters the name Standards in the Name field and
clicks Add, which displays the Add Folder Target dialog box. In the Add Folder Target dialog
box, the instructor clicks Browse and the Browse for Shared Folders dialog box is displayed,
within which the instructor expands the Share folder and selects Corporate SOP. The dialog
box closes and the Add Folder Target dialog box is displayed, in which the path
\\FILESERVER1\Share\Corporate SOP is displayed in the Path to folder target field. The
instructor clicks OK, the dialog box closes, and the New Folder dialog box is displayed, in
which the path \\FILESERVER1\Share\Corporate SOP is displayed in the Folder targets field.
The instructor clicks OK, the dialog box closes, and the tabbed page for
\\corp.brocadero.com\Public node is displayed. This tabbed page includes four tabs:
Namespace, Namespace Servers, Delegation, and Search. The Namespace tab is already
selected and its tabbed page displays the namespace name Standards. Next the instructor
right-clicks Standards and from the shortcut menu clicks Properties, which launches the
Standards Properties dialog box. This dialog box consists of three tabs: General, Referrals,
and Advanced. The General tab is already selected. The instructor clicks the Referrals tab,
which allows you to specify the amount of time that client cache (store) referrals for the folder.
In the Cache duration (in seconds) field, the number 1,800 is already entered. The Referrals
tab includes "Exclude targets outside of the client's site" and "Clients fail back to preferred
targets" checkboxes, from which the instructor selects the "Exclude targets outside of the
client's site" checkbox.]

So we could Exclude targets outside of the client's site. But if access to these documents is
more critical than bandwidth, if that server in the client site goes down – let's say I only have
one server in that client site, if that server goes down, I want them to failover to a server that is
outside their site, right? If access to the documents is more important than bandwidth, I don't
check that. If bandwidth is more important than the access to the documents, I check it.

Down here, Clients fail back to preferred targets, in the event that the target in their site has
gone offline, and now they start hitting a target that is in another remote site, when that local
target comes back online I want them to fail back to that preferred target. Over here on the
Advanced tab, I can use the inherited permissions from the local file system – the local NTFS permissions – or I can set explicit permissions in here. That is a look at the creation of
namespaces, adding folders and folder targets to those namespaces, and configuring those
folders.

[The DFS Management Console is open and the Standards Properties dialog box is displayed.
In this dialog box, the Referrals tab is already selected, within which the instructor has already
selected the "Exclude targets outside of the client's site" checkbox. The instructor clears the
"Exclude targets outside of the client's site" checkbox, selects the "Clients fail back to preferred
targets" checkbox, and then selects the Advanced tab that allows you to specify how DFS
Namespaces controls which users can see the folder. The Advanced tab comprises the round
button option "Use inherited permissions from the local file system," which is already selected,
and the round button option "Set explicit view permissions on the DFS folder." Then the
instructor clicks OK, the Standards Properties dialog box closes, and the Namespace tabbed
page for \\corp.brocadero.com\Public is displayed in the view pane.]
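The same folder, folder target, and referral settings can be scripted. A hedged sketch with the DFSN cmdlets, using this lab's paths – the second, branch-server target path is hypothetical, and -EnableInsiteReferrals roughly corresponds to "Exclude targets outside of the client's site" while -EnableTargetFailback corresponds to "Clients fail back to preferred targets":

# Create the Standards folder in the namespace and point it at the Corporate SOP share
New-DfsnFolder -Path "\\corp.brocadero.com\Public\Standards" -TargetPath "\\FILESERVER1\Share\Corporate SOP"

# Add a second folder target later, for example on a branch server (hypothetical path)
New-DfsnFolderTarget -Path "\\corp.brocadero.com\Public\Standards" -TargetPath "\\SERVER7\Share\Corporate SOP"

# Tune the referral behavior discussed above
Set-DfsnFolder -Path "\\corp.brocadero.com\Public\Standards" -EnableInsiteReferrals $false -EnableTargetFailback $true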

3. Demo: Creating replication groups


We have installed DFS. We have added a namespace, folders, and folder targets. Now we want to
replicate that content to other servers. And so, here I can say New Replication Group, we are
going to configure the servers that will participate in this Replication group. And you can see
there are two choices here: Multipurpose replication group or Replication group for data
collection. The Replication group for data collection is a great example of a choice that I
would make for collecting data in a hub and spoke topology.

For example, I have remote sites, those remote sites have DFS servers, and the users in those
remote sites generate content and save it to those servers. Then I replicate that content back
to HQ, where it is collated and processed in-house at HQ. And for my Disaster Recovery sites, right, for my off-site DR, that is a great choice too, right? That is an example of where I replicate everything from the remote sites to the data warehouse, which would be the hub of my hub and spoke topology. And in the data warehouse, we have a failover solution in the event of Internet connectivity issues or the outright loss of a particular site.

[The DFS Management Console is open and the tabbed page for Replication node is
displayed, which is empty. The instructor right-clicks the Replication node and from the
shortcut menu, clicks New Replication Group, Which launches the New Replication Group
Wizard. The Replication Group Type page of the wizard is displayed, which allows you to
select a replication group from the available two types, Multipurpose replication group and
Replication group for data collection. Multipurpose replication group option configures
replication between two or more servers for publication, content sharing, and other scenarios.
Replication group for data collection configures two-way replication between two servers, such
as a branch server and a hub (destination) server, which allows you to collect data at the hub
server and then back up the data on the hub server. Multipurpose replication group option is
already selected, for which the following steps are displayed in the navigation pane: Name and
Domain Replication Group Members Topology Selection Hub Members Hub and Spoke
Connections Replication Group Schedule and Bandwidth Primary Member Folders to
Replicate Review Settings and Create Replication Group Confirmation The instructor selects
Replication group for data collection option and the following steps are displayed in the
navigation pane: Name and Domain Branch Server Replicated Folders Hub Server Target
Folder on Hub Server Replication Group Schedule and Bandwidth Review Settings and Create
Replication Group Confirmation]

Up here multipurpose replication group, this is the standard DFS topology and this is going to
let me replicate between all my DFS servers. Now in our example here, guys I only got the
two. But I could add additional servers at any time and know that any changes made at any
one of those would be replicated to all the partners in the Replication group. So we are going
to choose Multipurpose replication group – the DFS standard. What is the name of this
Replication group? This is the SOP Replication group because the function of this Replication group is to replicate corporate standard operating procedures to all sites, so that copies of those standard procedure documents can be found locally on the servers in every office. That is the idea here.

Who participates in replication? Well every server that is going to participate in replication has to have the DFS Replication component installed. And you guys remember, right, that is the Install-WindowsFeature FS-DFS-Replication feature. And here are the servers on my network that I have that service installed on.

[The New Replication Group Wizard is open and the Replication Group Type page is
displayed. In this page, Replication group for data collection option is already selected and the
steps to create the replication group are displayed in the navigation pane. The instructor
selects the Multipurpose replication group option and clicks Next. As a result, the Name and
Domain page of the wizard is displayed, within which the instructor enters the name SOP in
the Name of replication group field, and then enters the text Replicate Corp SOP to all sites in
the Optional description of replication group field. Then the instructor clicks Next and the
Replication Group Members page of the wizard is displayed. In this page, the instructor clicks
Add and the Select Computers dialog box is displayed, within which the instructor enters the
name Fileserver1 in the Enter the object names to select (examples) field and clicks OK. The
Select Computers dialog box closes and the Replication Group Members page of the wizard is
displayed. This page includes the Members section, which consists of two columns, Server
and Domain. The Server column has a single row entry FILESERVER1, and the Domain
column has the entry corp.brocadero.com. The instructor again clicks Add and the Select
Computers dialog box is displayed, within which the instructor enters the name SERVER7 in
the Enter the object names to select (examples) field and clicks OK. The dialog box closes and
in the Replication Group Members page of the wizard, in the Members section, the Server
column now has a second row entry SERVER7 and the Domain column has a second row
entry corp.brocadero.com. Then the instructor clicks Next and the Topology Selection page of
the wizard is displayed, which allows you to select a topology of connections among members
of the replication group. The Topology Selection page includes three options, Hub and spoke,
Full mesh, and No topology. From these options, Full mesh is already selected.]

Here, what kind of topology do we want? There is No topology, which means no replication will take place until I define connections myself, or there is Full mesh – for my multipurpose replication groups it is going to be Full mesh. If we had made the other choice, data collection, it would be Hub and spoke. So for our purposes, we are going to say Full mesh.

Now do I Replicate continuously using the specified bandwidth? So every time there is a
change, use whatever bandwidth is available or maximize the bandwidth, right? So you got
bandwidth throttling built right into this. So for those of us that are concerned about the net
available bandwidth, that is a great choice. And then Replicate during the specified days
and times – so I could edit a schedule, I could create a replic...if I really got replication
concerns, and I want replication to happen in those off hours, right? I want that replication to
happen at night because we just don't have the bandwidth for it to replicate during the
production schedule, right? During the production day, I could come in here and I could specify
that replication time.

You know on the weekends maybe we allow for Full replication all day and all night because
we just are not here, right? There is plenty in that available bandwidth over the weekends. The
only time that concerns me is during the business day. And so we could set a schedule in that
fashion, right? And of course the bandwidth throttling is built right in. For our purposes here in
the test environment, we will replicate continuously.

[The New Replication Group Wizard is open and the Topology Selection page is displayed,
which helps you select a topology of connection among members of the replication group. In
this page, Full mesh is already selected, in which each member replicates with all other
members of the replication group. The instructor clicks Next and the Replication Group
Schedule and Bandwidth page of the wizard is displayed, which allows you to select the
replication schedule and bandwidth to be used by default for all new connections in the
replication group. This page includes two options, Replicate continuously using the specified
bandwidth and Replicate during the specified days and times. The Replicate continuously
using the specified bandwidth option is already selected, which includes the Bandwidth drop-
down list from which Full is already selected. The instructor selects the Replicate during the
specified days and times option and the Edit Schedule button is enabled, which the instructor
clicks and the Edit Schedule dialog box is displayed. This dialog box includes a table that
consists of eight columns with headings All, Sunday, Monday, Tuesday, Wednesday,
Thursday, Friday, and Saturday. The table also consists of 24 columns, from which the
heading of first and last column is 12 AM, the heading of second column is 2 AM, and the
heading of each subsequent column is an increased time with a difference of two hours and a
12-hour time format. From the table, the instructor selects the entire time of a day for Sunday
and Saturday, and for Monday through Friday, the instructor selects 12 AM to 7 AM and 5 PM
to 12 AM. Then from the Bandwidth usage drop-down list, the instructor clicks Full, then clicks
OK, the Edit Schedule dialog box closes, and the Replication Group Schedule and Bandwidth
page of the wizard is displayed. Then the instructor clicks Next and the Primary Member page
of the wizard is displayed.]

Who is the primary member? Well this is the server that is currently hosting the content and we
are going to replicate that content out to the other members of the Replication group. What
folders are we replicating? Well on the local machine we know that in Share there is a folder
called Corporate SOP, and that is the content that we actually want to replicate out. So I grab
that folder there.

Now when it gets to the other server, what is the path in which I want it to be saved? And so on
that remote server, we can see there is a top level folder called Share and we want all of our
DFS shares and folder targets to be stored in there. I could choose...now this is a great feature
guys and this is one of the things that makes DFS so much better today than it ever was in the
past. I can Make the selected replicated folder on this member read-only. See here is the
problem with DFS, right? So many people tried to use it as a collaboration solution and it is just
not that. So today I can lock those remote copies down and make them read-only. If I need to,
right?

[The New Replication Group Wizard is open and the Primary Member page is displayed, which
allows you to select the server that contains the content you want to replicate to other
members. In this page, from the Primary member drop-down list, the instructor clicks
FILESERVER1 and clicks Next. As a result, the Folders to Replicate page of the wizard is
displayed, which helps you select a folder on the primary member that you want to replicate to
other members of the replication group. In this page, the Replicated folders section is empty.
The instructor clicks Add and the Add Folder to Replicate dialog box is displayed. In this dialog
box, from the "Local path of folder to replicate" section, the instructor clicks Browse and the
Browse For Folder dialog box is displayed, from which the instructor expands Share and then
selects the Corporate SOP folder. Then the instructor clicks OK, the Browse For Folder dialog
box closes, and the Add Folder to Replicate dialog box is displayed, within which the path
C:\Share\Corporate SOP is displayed in the "Local path of folder to replicate" field. The Add
Folder to Replicate dialog box includes the options "Use name based on path" and "Use
custom name", from which "Use name based on path" is already selected and the name
Corporate SOP is displayed in its field. Next the instructor clicks OK, the Add Folder to
Replicate dialog box closes, and the Folders to Replicate page of the wizard is displayed,
within which, in the Replicated folders section, Local Path is displayed as C:\Share\Corporate
SOP with Replicated Folder Name Corporate SOP and NTFS Permissions as Use existing
permission. Then the instructor clicks Next and the Local Path of Corporate SOP on Other
Members page of the wizard is displayed, which allows you to select the appropriate member.
In this page, the Member details section displays the member SERVER7 with Membership
Status Disabled. The instructor clicks Edit and a dialog box is launched, which allows you to
select the initial status of the replicated folder on the member. The Membership status includes
two options, Disabled and Enabled, from which Disabled is already selected. The instructor
selects Enabled and the "Local path of folder" field is enabled along with the Browse button.
The instructor clicks Browse and the Browse For Folder dialog box is displayed, from which the
instructor selects Share and clicks OK. As a result, the dialog box closes and the path
C:\Share is displayed in the "Local path of folder" field of the previous dialog box. The dialog
box includes the "Make the selected replicated folder on this member read-only" checkbox,
which the instructor selects.]

And that will minimize conflicts and in our case that is a good choice to make. All right, we get
a review of the settings there, we can go ahead and create the Replication group; we see the
Replication group created. Replication will not begin until the configuration is picked up by the
members of the Replication group. The amount of time this takes depends on Active Directory
Domain Services replication latency.

Now in our example here, it should not take that long. If I right-click this, I can see that I have
some options. I can come into the Properties for this Replication group, I could edit that
schedule, right? If it turns out later on that that schedule is no good, we are consuming too
much bandwidth, I could scale that schedule back, right? We could come in here and we could
say, during business hours, we are just not going to let replication happen, bang! Over here,
there is a choice for New Topology, if I wanted to recreate the topology. If I didn't want that full
mesh topology, I wanted to move to a hub and spoke kind of topology, I could edit that in here.
For our purpose I am going to say No, we are going to keep it that way. That is a look guys at
the creation and management of Replication groups.

[The New Replication Group Wizard is open and a dialog box is displayed, within which, in the
Membership status, the Enabled option is already selected, the path C:\Share is displayed for
"Local path of folder" and the "Make the selected replicated folder on this member read-only"
checkbox is already selected. The instructor clicks OK, the dialog box closes, and the Local
Path of Corporate SOP on Other Members page of the wizard is displayed. In this page, the
Primary member is displayed as FILESERVER1 and the Primary member local path is
displayed as C:\Share\Corporate SOP. In the Member details section of the page, the member
SERVER7 is displayed with Local Path C:\Share and Membership Status as Enabled (read-
only). The instructor clicks Next and the Review Settings and Create Replication Group page
of the wizard is displayed, in which the instructor clicks Create and the Confirmation page of
the wizard is launched, which displays the message "You have successfully completed the New
Replication Group Wizard." Then the instructor clicks Close and the Replication Delay dialog
box is launched, which displays the message "Replication will not begin until the configuration
is picked up by the members of the replication group. The amount of time this takes depends
on Active Directory Domain Services replication latency as well as the polling interval." The
instructor clicks OK, the Replication Delay dialog box closes, and the DFS Management
Console is displayed, in which the Replication node is already selected and in the Replication
tabbed page, the Name SOP is displayed. Then the instructor right-clicks SOP and from the
shortcut menu, clicks Properties. As a result, the SOP Properties dialog box is displayed, in
which the instructor clicks Edit Schedule and the Edit Schedule dialog box is displayed. In the
dialog box, in the table of schedule, the entire time of a day is already selected for all days of
the week, and from the Bandwidth usage drop-down list, Full is already selected. In the table,
for Monday through Friday, the instructor selects 6 AM to 6 PM and then from the Bandwidth
usage drop-down list, clicks No replication. Next the instructor clicks OK, the Edit Schedule
dialog box closes, and the SOP Properties dialog box is displayed, in which Replication group
is displayed as SOP, Description is displayed as Replicate Corp SOP to all sites, and Domain
is displayed as corp.brocadero.com. Then the instructor clicks OK, the SOP Properties dialog
box closes, and the Replication tabbed page of the DFS Management Console is displayed.
Next from the Actions pane, from the SOP node, the instructor clicks New Topology and the
Warning dialog box is launched, which displays the message "All existing connections (2) will
be deleted at the end of the New Topology Wizard so that the new topology can be created.
Do you want to continue?" The instructor clicks No, the Warning dialog box closes, and the
Replication tabbed page of the DFS Management Console is displayed.]
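For admins who would rather script it, here is a hedged sketch of the same replication group built with the DFSR cmdlets; the group, folder, server, and path names are the ones from this lab, and the read-only flag mirrors the choice made for SERVER7:

# Create the replication group and the replicated folder
New-DfsReplicationGroup -GroupName "SOP" -Description "Replicate Corp SOP to all sites"
New-DfsReplicatedFolder -GroupName "SOP" -FolderName "Corporate SOP"

# Add both members and a connection between them (Add-DfsrConnection creates both directions)
Add-DfsrMember -GroupName "SOP" -ComputerName "FILESERVER1","SERVER7"
Add-DfsrConnection -GroupName "SOP" -SourceComputerName "FILESERVER1" -DestinationComputerName "SERVER7"

# FILESERVER1 is the primary member holding the authoritative content
Set-DfsrMembership -GroupName "SOP" -FolderName "Corporate SOP" -ComputerName "FILESERVER1" -ContentPath "C:\Share\Corporate SOP" -PrimaryMember $true -Force

# SERVER7 receives a read-only copy under C:\Share
Set-DfsrMembership -GroupName "SOP" -FolderName "Corporate SOP" -ComputerName "SERVER7" -ContentPath "C:\Share" -ReadOnly $true -Force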

4. Demo: Staging and compression


There are two important concepts to be clear on, and those are Remote Differential Compression, or RDC, and staging. And the first thing is staging, right? Let's take a look at this. Now guys we know, right...so here is a folder that is part of a DFS Replication group. And if I come in here on the Properties for this share, I see that there is this Staging tab. And what we see here is the path – the local file path to the Staging directory, and the size – the maximum size that that directory can be.

Now guys here is the gig, right? This is why this is important, every time somebody changes a
file on the DFS server, we want to replicate those changes to all of the other members of the
Replication group. But we don't want to limit the amount of access that other users have to that
file, right? And there is going to be an open file handle when it is in the process of copying, and
so users are going to be able to get only read-only access to it. So what we do is we copy that
to a Staging folder locally, right? We copy that file to a Staging folder locally.

[The DFS Management Console is open, in which the SOP node is selected and the SOP
tabbed page is displayed, which includes four tabs: Memberships, Connections, Replicated
Folders, and Delegation. The Memberships tab is already selected, which includes six
columns: State, Local path, Membership Status, Member, Replicated Folder, and Staging
Quota. The Memberships tab includes two entries within the Replicated Folder: Corporate
SOP group. These entries are listed in the six columns as follows: for the Local Path
C:\Share\Corporate SOP, Membership Status is Enabled, Member is FILESERVER1,
Replicated Folder is Corporate SOP, and Staging Quote is 4.00; for the Local Path C:\Share,
the Membership Status is Enabled (read-only), Member is SERVER7, Replicated Folder is
Corporate SOP, and Staging Quota is 4.00. The instructor right-clicks the Membership Status
for C:\Share\Corporate SOP and from the shortcut menu, clicks Properties, which launches the
FILESERVER1 (Corporate SOP) Properties dialog box. This dialog box consists of four tabs:
General, Replicated Folder, Staging, and Advanced, from which the General tab is already
selected. The instructor clicks the Staging tab, within which, in the Staging path field, the path
C:\Share\Corporate SOP\DfsrPrivate\Staging is displayed and in the Quota (in megabytes)
spin box, the value 4096 is already entered.]

And then the actual file is available for other folks to open, work with, edit, and whatever they
have to do. And the copy...the most recent changes are sitting in the Staging directory waiting
to be replicated. And so guys, when we look at this thing we see that there is this, you know,
upper limit, 4096 megabytes, right? So that is a 4 GB limit. Now I may know that I don't need
that much size, right? I don't need it to be that big. So I could scale this thing down, you know,
if I wanted to, you know, let's say something like that. When we think about staging, we can
also configure the minimum file size for staging to help improve performance.

So for example, the default is that any file over 64 kilobytes is staged. Now that is only when remote differential compression is enabled, and we will talk about remote differential compression coming right up here. There is a PowerShell cmdlet, the Set-DfsrMembership cmdlet – Set Distributed File System Replication, or DFSR, membership – with the MinimumFileStagingSize parameter that you can use to configure the lower limit for files that should be staged, right?

[The DFS Management Console is open and the FILESERVER1 (Corporate SOP) Properties
dialog box is displayed. In this dialog box, the Staging tab is already selected and in the
Staging path field, the path C:\Share\Corporate SOP\DfsrPrivate\Staging is displayed. In the
Quota (in megabytes) spin box, the value 4096 is already entered, which the instructor
changes to 2048.]
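A hedged sketch of making that same staging change from PowerShell; the quota is set per member and per replicated folder, and the accepted values for the minimum staging size are an enumeration, so check Get-Help Set-DfsrMembership before relying on a specific value:

# Drop the staging quota for Corporate SOP on FILESERVER1 from 4 GB to 2 GB
Set-DfsrMembership -GroupName "SOP" -FolderName "Corporate SOP" -ComputerName "FILESERVER1" -StagingPathQuotaInMB 2048 -Force
# The -MinimumFileStagingSize parameter on the same cmdlet adjusts the smallest file that gets staged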

So what is this remote differential compression that this thing is dependent on? Well if we come
over here to the Connections tab I see the servers, and these servers have connections to
each other, right? And they replicate across those connections. Oh I should show you that,
right? There is this choice here, right, Replicate Now. So I can force that replication to
happen, right? That is an...this is an example of manual replication between the two.

In our example, I just want to go into the Properties and you can see there you could change the schedule, or over here I see these two checkboxes – Enable replication on this connection, so I want to replicate across this connection, and then here, Use remote differential compression. And
remote differential compression is a great thing when we think about saving bandwidth, but it is
not a great thing when we think about saving processor capacity. So if my processor on the
DFS server is a bottleneck for me, I don't want to consume more processor cycles but I have
plenty of available bandwidth, I choose not to use remote differential compression, right?

[The DFS Management Console is open and the FILESERVER1 (Corporate SOP) Properties
dialog box is displayed. In this dialog box, the Staging tab is already selected and in the
Staging path field, the path C:\Share\Corporate SOP\DfsrPrivate\Staging is displayed, and in
the Quota (in megabytes) spin box, the value 2048 is already entered. The instructor clicks
Cancel, the dialog box closes, and the Memberships tabbed page is displayed. Then the
instructor clicks the Connections tab, which includes the six columns State, Sending Member,
Sending Site, Connection Status, Receiving Member, Receiving Site, and Schedule Type. The
Connections tab also includes two entries, one in the Sending Member: FILESERVER1 group
and the other in the Sending Member: SERVER7 group. These entries are listed in the six
columns as follows: for the Sending Member FILESERVER1, the Sending Site is NYC,
Connection Status is Enabled, Receiving Member is SERVER7, Receiving Site is NYC, and
Schedule Type is partially visible Replication G; for the Sending member SERVER7, the
Sending Site is NYC, Connection Status is Enabled, Receiving Member is FILESERVER1,
Receiving Site is NYC, and Schedule Type is partially visible Replication G. The instructor
right-clicks Connection Status for FILESERVER1 and from the shortcut menu, clicks
Properties, which launches the FILESERVER1 to SERVER7 Properties dialog box. The
instructor closes the dialog box and then again right-clicks the Connection Status for
FILESERVER1 and from the shortcut menu, clicks Replicate Now, which displays the
Replicate Now dialog box. The instructor clicks OK, the dialog box closes, and the Resume
Schedule Successful dialog box is displayed, which includes the message "The standard
replication schedule of selected connections is successfully resumed." The instructor clicks OK
and the dialog box closes. Then the instructor right-clicks FILESERVER1 and from the shortcut
menu, clicks Properties, which launches the FILESERVER1 to SERVER7 Properties dialog
box. This dialog box includes two tabs, General and Schedule, from which the General tab is
already selected. The instructor selects the Schedule tab and navigates back to the General
tab, which includes the "Enable replication on this connection" and "Use remote differential
compression (RDC)" checkboxes, which are both selected. From these checkboxes, the
instructor clears the "Use remote differential compression (RDC)" checkbox.]

Meanwhile, if I have plenty of spare processor capacity but I am worried about the amount of
bandwidth that replication takes up, I use remote differential compression. It is worth pointing
out guys that a function of remote differential compression is cross-file RDC, and that is just a thing of beauty, and this has been around for a long time, right? So if you have been using DFS for a while you are probably familiar with this. But if I look here, I can see that there is this PowerShell cmdlet, Set-DfsrConnection -DisableCrossFileRDC. And I can...that is a Boolean value, right? If I set it to 1, then I am going to disable that remote differential compression, that cross-file remote differential compression.

[The DFS Management Console is open and the FILESERVER1 to SERVER7 Properties
dialog box is displayed, in which the "Enable replication on this connection" checkbox is
already selected. The instructor selects the "Use remote differential compression (RDC)"
checkbox and clicks OK. As a result, the dialog box closes and the Connections tabbed page
for SOP is displayed. Then the instructor navigates to Windows PowerShell, in which, at the
C:\Users\administrator.CORP> prompt, the command set-dfsrConnection -
DisableCrossFileRDC 0 is entered. In this command, the instructor changes 0 to 1 and
executes the command, and the following output is displayed: cmdlet Set-DfsrConnection at
command pipeline position 1 Supply values for the following parameters:
SourceComputerName: set-dfsrConnection -DisableCrossFileRDC 1]

And here I can just specify the computers that this is going to happen on: Fileserver1 is the source; Server7 is the destination. We say okay there, and there I see that MinimumRDCFileSizeInKB, right? That is the minimum file size for staging when RDC is enabled. The default is 64. If I want to change that, I could use that Set-DfsrMembership cmdlet with the MinimumFileStagingSize parameter. Or if we look up here, CrossFileRdcEnabled: False, right? I just turned it off. Now we could turn it back on again if I set that to 0. And why do I want to do this? I want this thing enabled in almost all cases because it is going to save me replication cost. Here is why, what cross-file RDC does is...let's say I copy a PowerPoint and I add two slides to it, right? Cross-file RDC identifies that the file was based on an existing file and requests only the changes. That is a thing of beauty.

[Windows PowerShell is open and at the C:\Users\administrator.CORP> prompt, the command


set-dfsrConnection -DisableCrossFileRDC 1 has already been run and the following output is
displayed: cmdlet Set-DfsrConnection at command pipeline position 1 Supply values for the
following parameters: SourceComputerName: set-dfsrConnection -DisableCrossFileRDC 1
The instructor changes the parameters to: SourceComputerName: Fileserver1
DestinationComputerName: Server7 Then the instructor hits Enter on keyboard and the
following output is displayed: GroupName : SOP SourceComputerName : FILESERVER1
DestinationComputerName : SERVER7 DomainName : corp.brocadero.com Identifier :
45fad3e9-fdf7-47ff-833b-dc0d01d9d279 Enabled : True RdcEnabled : True
CrossFileRdcEnabled : False Description : MinimumRDCFileSizeInKB : 64 State : Normal
Next at the C:Users\administrator.CORP> prompt, the instructor runs the command set-
dfsrConnection -DisableCrossFileRDC 0 and the following output is displayed: cmdlet Set-
DfsrConnection at command pipeline position 1 Supply values for the following parameters:
SourceComputerName: The instructor sets the SourceComputerName to Fileserver1 and the
DestinationComputerName to Server7, which displays the following output: GroupName :
SOP SourceComputerName : FILESERVER1 DestinationComputerName : SERVER7
DomainName : corp.brocadero.com Identifier : 45fad3e9-fdf7-47ff-833b-dc0d01d9d279
Enabled : True RdcEnabled : True CrossFileRdcEnabled : True Description :
MinimumRDCFileSizeInKB : 64 State : Normal Next the instructor navigates to the DFS
Management Console and the Connections tabbed page for SOP is displayed.]
Managing DFS in Windows Server 2012 R2
Learning Objective
After completing this topic, you should be able to
◾ recognize the implications of a DFS database shutting down unexpectedly in
Windows Server 2012 R2

1. Using PowerShell
More and more, our ability to manage services, servers, and Windows itself is moving over to
PowerShell. So guys look, if you are not looking at PowerShell you have got to be looking at
PowerShell. PowerShell is a simple verb-noun language – there is always a verb and a
noun. And there are two principal objects that we manage in Distributed File System, or DFS,
right? DFS Namespaces and DFS Replication. And so I have got a host of cmdlets for
managing DFS servers and DFS Namespaces, and then a host of cmdlets for managing DFS
Replication.

You know, if we take a look at some of these, take a look at what the verbs are, right? Get will
always return information to me. Set will always set a configuration setting. Grant will grant
rights or permissions over the object specified, right? So for example, I have Grant-DfsnAccess
for namespace access – I want to grant access to that namespace, right? New will create
a new folder or a new folder target or, in replication, a new DFS Replication
group. Remove will remove that object, right? This is a look at some of the PowerShell
cmdlets and their structure. But guys, for the sake of your career, you have got to be looking at
PowerShell.
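Just to illustrate that verb-noun pattern, here is a minimal sketch – the namespace path and the account name are made-up examples, not values from this environment:

Get-DfsnRoot                                                                                  # Get returns information
New-DfsnFolder -Path "\\corp.brocadero.com\Public\Docs" -TargetPath "\\Fileserver1\Docs"     # New creates a folder with a target
Grant-DfsnAccess -Path "\\corp.brocadero.com\Public\Docs" -AccountName "CORP\Marketing"      # Grant gives an account access
Set-DfsnFolder -Path "\\corp.brocadero.com\Public\Docs" -Description "Marketing documents"   # Set changes a configuration setting
Remove-DfsnFolder -Path "\\corp.brocadero.com\Public\Docs"                                   # Remove deletes the object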

[The DFSN cmdlets are as follows: Get-DfsnAccess Get-DfsnFolder Get-DfsnFolderTarget
Get-DfsnRoot Get-DfsnRootTarget Get-DfsnServerConfiguration Grant-DfsnAccess
Move-DfsnFolder New-DfsnFolder New-DfsnFolderTarget New-DfsnRoot New-DfsnRootTarget
Remove-DfsnAccess Remove-DfsnFolder Remove-DfsnFolderTarget Remove-DfsnRoot
Remove-DfsnRootTarget Revoke-DfsnAccess Set-DfsnFolder Set-DfsnRoot Set-DfsnRootTarget
Set-DfsnServerConfiguration The DFSR cmdlets are as follows: Add-DfsrConnection
Add-DfsrMember ConvertFrom-DfsrGuid Export-DfsrClone Get-DfsrBacklog Get-DfsrCloneState
Get-DfsrConnection Get-DfsrConnectionSchedule Get-DfsReplicatedFolder
Get-DfsReplicationGroup Get-DfsrIdRecord Get-DfsrMember Get-DfsrMembership
Get-DfsrPreservedFiles Get-DfsrServiceConfiguration Get-DfsrState Import-DfsrClone
New-DfsReplicatedFolder New-DfsReplicationGroup Remove-DfsrConnection
Remove-DfsReplicatedFolder Remove-DfsReplicationGroup Remove-DfsrMember
Remove-DfsrPropagationTestFile Reset-DfsrCloneState Restore-DfsrPreservedFiles
Set-DfsrConnection Set-DfsrConnectionSchedule Set-DfsReplicatedFolder
Set-DfsReplicationGroup Set-DfsrGroupSchedule Set-DfsrMember Set-DfsrMembership
Start-DfsrPropagationTest Suspend-DfsReplicationGroup Sync-DfsReplicationGroup
Update-DfsrConfigurationFromAD Write-DfsrHealthReport Write-DfsrPropagationReport]
2. Demo: Database cloning
Now here is the reality, in any good multimaster replication model, I have got to synchronize
the metadata between files so that the nonauthoritative servers know that they have the right
copies of the files, right guys? And the reality is that when you add a new server to a DFS
Replication group, the initial build of that data even if you preseed, right? And we know what
preseeding is – I copy the files there. You would think that doing that right...just copying the
files would save a lot of time and it does, right? Rather than letting them happen through
replication, right, I burn them to disk, I port them over. But there is still all that metadata, the
database has to be built. And so with database cloning combined with pre-seeding, we can
reduce that initial replication time – tenfold readily.

And think about it, we don't just do this when we add a new server, right guys? What are the
conditions under which we add new servers or would want to clone? If we are doing an
upgrade, if we are in a recovery process because a server failed, if we did a
replication redesign, or we are just decommissioning the 2003 and 2008 boxes and replacing them
with 2012.

Ladies and gentlemen, one of the new features of Windows Server 2012 R2 is the ability to
clone the DFS database. And we want to make the distinction clear here guys, that is not the
content, right? Before I can do any of this, I want to copy the content using Robocopy or
Windows Backup or, you know, whatever it is to copy the files from the primary server where
they are currently hosted to that replica DFS server out there someplace, which is commonly in
another office, right? It is in a geographically disparate location. And so I don't...if I got four GB
of data on that primary server, I don't want to have that all replicated over the WAN links, right?
Who knows how long it is going to take.

So we pre-seed those content directories and that is a separate process. Once that pre-
seeding has taken place, we can then clone the DFS database, which tracks all the files that
are in that content directory – the changes to those files, right? And that is a separate thing
entirely from the content. Now how do we do that? Well when you look at the documentation
the first thing that it says is to create a Replication group that has only one member.

[The Windows PowerShell is open and the following variable is displayed:


$dfsrComputerName="Filesserver1.corp.brocadero.com"]

Now for you guys that spend all your time in the GUI, right, you know, if you go into the GUI,
right, one of the first things that you come into here is this, we go ahead and we add the
primary server fileserver1 in this example. And we try to go to next; we can't, right? Can't
do it in the GUI. So guys if you have been putting off PowerShell, right, here is an opportunity
to learn it because the database cloning is definitely on the test. So you definitely want to
spend some time with that. So how do I create this Replication group in PowerShell first, so
that I can then export the DFS database? I have to set some variables. Did you see that $ sign
there in front of dfsrComputerName? That $ sign says I am defining this variable.

The variable is the computer name, the name is Fileserver1.corp.brocadero.com,


right? There are other variables that I have to define. I have to define the replication folder
path, so I do variable dfsrReplicationFolderPath and I specify the path to the share,
right? Then I need...oh sorry. Let me do this so you can see them all here, right? Then I need
the Replication group name and I give it a name. We will call it RG1, Replication group 1.
[The Windows PowerShell is open and the following variable is displayed:
$dfsrComputerName="ilesserver1.corp.brocadero.com." The instructor opens the New
Replication Group Wizard and clicks Next. As a result, the Replication Group Members page of
the wizard is displayed. The partially displayed page contains a Members section and an Add
button. The instructor clicks the Add button and the Select Computers dialog box is displayed.
The Select Computers dialog box includes fields as follows: Select this object type, From this
location, and Enter the objects name to select. Next the instructor enters fileserver1 in the
Enter the object names to select field and clicks the Check Names button and then OK, and
navigates back to the Replication Group Members page of the wizard. Next the instructor
clicks Cancel to close the page and opens the Windows PowerShell. The instructor then enters
the following variables in the PowerShell window: $dfsrreplicationFolderPath="C:\Share\REPl
$dfsrReplicationGroupName='RG1"]

Then I need the replicated folder name, and that replicated folder is called REPL in this
example, right? Now when I have done all that...well hang on. When I have defined these
variables in PowerShell, then I call New-DfsReplicationGroup -GroupName and I
specify the variable. So we are going to create a new Replication group, and what name will it
have? RG1, the variable that we specified above there. Then we call the New-
DfsReplicatedFolder cmdlet and specify the GroupName again using the variables, and the
FolderName again using the variables that we defined above. And then we run Add-DfsrMember
with the GroupName and the ComputerName, again with the variables specified above. Finally,
we run Set-DfsrMembership, right? Again calling the variables, and we specify that this machine is the
PrimaryMember.

[The Windows PowerShell is open and the following variables are displayed:
$dfsrComputerName="Filesserver1.corp.brocadero.com"
$dfsrreplicationFolderPath="C:\Share\REPl $dfsrReplicationGroupName='RG1"
$dfsrReplicationFolderName='REPL" NewDFSReplicationGroup –GroupName
$dfsrrelicationGroupname New-DFSRReplicatedFolder –Groupname
$dfsrRelicationGroupname –FolderName $FolderName $dfsrReplicatedFolderName Add-
Dfsrmember –GroupName $dfsrReplicationgroupName –Computername $dfsrComputername
Set-dfsrMembership –GroupName $dfsrReplicationgroupname –Foldername
$dfsrreplicationFoldername –contentpath $dfsrreplicatedFolderPath –Computername
$dfsrComputername –Primarymember $true]
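Cleaned up, the whole sequence the instructor describes looks roughly like this – a sketch only, using the server, share path, and names from this demo:

# Variables describing the one-member replication group
$dfsrComputerName         = "Fileserver1.corp.brocadero.com"
$dfsrReplicatedFolderPath = "C:\Share\REPL"
$dfsrReplicationGroupName = "RG1"
$dfsrReplicatedFolderName = "REPL"

New-DfsReplicationGroup -GroupName $dfsrReplicationGroupName                                    # create the group
New-DfsReplicatedFolder -GroupName $dfsrReplicationGroupName -FolderName $dfsrReplicatedFolderName
Add-DfsrMember -GroupName $dfsrReplicationGroupName -ComputerName $dfsrComputerName             # add the primary server
Set-DfsrMembership -GroupName $dfsrReplicationGroupName -FolderName $dfsrReplicatedFolderName `
    -ContentPath $dfsrReplicatedFolderPath -ComputerName $dfsrComputerName -PrimaryMember $true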

Now that we have done that, we have met the minimum requirements to come over here and
actually execute the Export-DfsrClone cmdlet. So if we take a look in here, the first thing
that we do is get the Windows Management Instrumentation, or WMI, object, right? Which WMI
object? The replicated folder information class in the DFSR namespace, because we want to understand what the
current replication status is. The replicated folders have to be in a normal state for us to
execute this process, and we see in fact that the state value of 4 there indicates that they are in
normal replication status.

[The Windows PowerShell is open and the following variables are displayed:
$dfsrComputerName="Filesserver1.corp.brocadero.com"
$dfsrreplicationFolderPath="C:\Share\REPl $dfsrReplicationGroupName='RG1"
$dfsrReplicationFolderName='REPL" Set-dfsrMembership –GroupName
$dfsrReplicationgroupname –Foldername $dfsrreplicationFoldername –contentpath
$dfsrreplicatedFolderPath –Computername $dfsrComputername –Primarymember $true The
instructor opens another PowerShell window and executes the following code: Get-wmiobject
–nameSpace "root\microsoft\windows\dfsr" –class msft_dfsrreplicatedFolderinfo
–computername Fileserver1 | ft replicatedfoldername, state –auto –wrap]

Now once we have done that, we have got to specify some variables again,
dfsrCloneVolume. What volume are we cloning? In this example, it is the C volume. What
is the path to the folder? The dfsrClone path – I see it specified there. I specify the
dfsrCloneDirectory, then New-Item -Path $dfsrCloneVolume\$dfsrCloneDir
-Type Directory, and finally we can do Export-DfsrClone and we specify the -
Volume based on the variables that we defined above.

So guys, I have got to do all that before I can export it and get that XML file. Then what do I do? I
do the Import-DfsrClone on the target machine that has already been pre-seeded
with the content, and I add that machine to the Replication group. The practical impact
of this is that I cut down the initial synchronization time from what could perhaps be days or
even weeks to minutes or hours.

[The Windows PowerShell is open and the following code run: Get-wmiobject –nameSpace
"root\microsoft\windows\dfsr" –class msft_dfsrreplicatedFolderinfo –computername Fileserver1
| ft replicatedfoldername, state –auto –wrap The instructor calls the following variables:
$dfsrCloneVolume="c:" $dfsrClonedirectory="\dfsrClone" New-Item –Path
$dfsrcloneVolume\$dfsrClonedir –Type Directory Export-dfsrclone –Volume
$dfsrCloneVolume\DFSRCloneDir]
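Put together, the export-and-import flow looks roughly like this – a sketch under the assumptions of this demo (the C: volume, a DfsrClone staging folder, and Fileserver1 as the source):

# On the source (primary) server – replicated folders must report state 4 (Normal) first
Get-WmiObject -Namespace "root\microsoft\windows\dfsr" -Class msft_dfsrreplicatedfolderinfo -ComputerName Fileserver1 |
    Format-Table replicatedfoldername, state -Auto -Wrap

$dfsrCloneVolume    = "C:"
$dfsrCloneDirectory = "DfsrClone"
New-Item -Path "$dfsrCloneVolume\$dfsrCloneDirectory" -Type Directory
Export-DfsrClone -Volume $dfsrCloneVolume -Path "$dfsrCloneVolume\$dfsrCloneDirectory"

# On the destination server – after pre-seeding the content and copying the clone folder across
Import-DfsrClone -Volume "C:" -Path "C:\DfsrClone"
# Then add the destination server to the replication group with Add-DfsrMember and Set-DfsrMembership.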

3. Database recovery
When you have a power loss, when you have a hardware failure, when the machine just shuts
down, you know what happens, right? That...we call that a dirty shutdown. Now in previous
versions of DFS – in 2012, 2008 R2 – an unexpected shutdown required you to re-enable
replication manually using a Windows Management Instrumentation, or WMI, method. But in
2012 R2, DFS automatically validates the database against the file system and then resumes
replication normally setting any file replication conflicts normally.

Now guys, you have got to appreciate how different database recovery is in Server 2012 R2
from everything that ever came before, right? In every previous version of DFS, whatever
we had – a power failure, a disk controller failure, a hardware failure, right? –
when we recovered from that dirty shutdown, all the files that were affected got marked with the
FRS_FENCE_INITIAL_SYNC (1) flag, which means essentially we are waiting for outbound
partner replication. And any changes that didn't replicate are lost, and that content gets moved
to the ConflictAndDeleted or the PreExisting folders.

Now that totally changes in 2012 R2; the affected files are fenced with the
FRS_FENCE_INITIAL_SYNC (3) flag, which means that they are essentially normal, right,
and so normal replication happens. In the first scenario, just imagine the consequences of a
power failure on the subnet that all your DFS servers run on – or all your DFS servers are
virtualized and the Hypervisor host fails. Now when the DFS servers come back up, none of
them are in a normal shutdown state. They are all in a dirty state, and replication can never get
started because they are all waiting for the others to say that they are normal. That does not
happen today. Today we are in a position to recover all those files.
Now before 2012 R2, what would happen when a dirty shutdown was detected? You would get
a warning logged to the Event Viewer – event ID 2213. And essentially, what 2213 said was that
the DFS Replication service stopped replication because the DFSR JET
database was not shut down cleanly and Auto-recovery was disabled. And then it would
go on to tell you that to resolve this issue, back up the files in the affected replicated folders,
then use the ResumeReplication Windows Management Instrumentation, or WMI, method to
resume replication.

Now event ID 2213 would also identify the globally unique identifier, or
GUID, for the volume that was affected. And so you would get the GUID and then you would
actually have the steps in the recovery process – it would actually give you the command
line that you have to run, right? So it is a great event ID. Now the beauty, of course, in 2012 R2 is
that we don't have any of that. Auto-recovery is enabled; you see the registry key there. And
that registry key is set so that Auto-recovery is always enabled. And we get an automatic
restart of the database; no administrator intervention.
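For reference, the two pieces the instructor mentions look roughly like this – the GUID is a placeholder you would read out of event ID 2213 on your own server, and the registry value is the one that controls auto-recovery:

# Pre-2012 R2 (or wherever auto-recovery is disabled): resume replication manually via WMI,
# using the volume GUID reported by event ID 2213
wmic /namespace:\\root\microsoftdfs path dfsrVolumeConfig where volumeGuid="xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx" call ResumeReplication

# The registry value behind auto-recovery (0 lets DFSR recover the database automatically)
Set-ItemProperty -Path "HKLM:\SYSTEM\CurrentControlSet\Services\DFSR\Parameters" -Name "StopReplicationOnAutoRecovery" -Value 0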

4. Demo: File restoration


New in Windows Server 2012 R2 are the PreExistingManifest.xml and the
ConflictAndDeletedManifest.xml files, which allow us to restore files that have been accidentally
deleted or that were edited in a conflicting way. We can pick and choose which files we restore,
right? Or we can restore them all en masse. Now if I take a look in here guys, you can see here
is that REPL directory and you can see we are on Server3. So this is replicated from that clone
database that we did before. And if we take a look here, we see we have
all these NLS files – these are font files, right? Let us assume that the function of this
directory is to distribute software updates, patches, image files, right? Things that we need
around the office, like drivers – and in this example, fonts, which is what these NLS files are.

[The REPL folder is open and the following font files are displayed in the partially displayed
page: Testfile_(75).NLS Testfile_(74).NLS Testfile_(73).NLS Testfile_(72).NLS Testfile_
(71).NLS Testfile_(70).NLS Testfile_(69).NLS Testfile_(68).NLS Testfile_(67).NLS Testfile_
(66).NLS Testfile_(65).NLS Testfile_(64).NLS Testfile_(63).NLS Testfile_(62).NLS Testfile_
(61).NLS Testfile_(47).NLS Testfile_(46).NLS Testfile_(45).NLS Testfile_(44).NLS Testfile_
(43).NLS Testfile_(42).NLS Testfile_(41).NLS Testfile_(40).NLS Testfile_(39).NLS Testfile_
(38).NLS Testfile_(37).NLS Testfile_(36).NLS Testfile_(35).NLS Testfile_(34).NLS]

And if I look here I can see, wait a minute, here is 47 and here is 61. Well what happened to all
the ones in between? Well we can drill down here, right? If I look in my replication directory, I
have this DfsrPrivate. Now this is a hidden folder normally, right? And you guys know how to
make these things appear, right? I go to View, to Options, to View, and then over here – I say
show hidden files and folders. So if you don't see it – that is why. Now if I look in the
ConflictAndDeleted, well wait a minute, here they are, right? There is 48, 49, 50, 51 there they
all are, right, just like I would expect. Now I want to get them out of there. I want to restore
them, which is something we could not do in previous versions.

So here is what I am going to do. I am going to come over to the PowerShell Integrated
Scripting Environment, or ISE, and this is a graphical interface, right guys? If I look over here, I
see that here are all the DFSR commands, right? I can filter down to just the DFSR commands. I see
that there is this command Get-DfsrPreservedFiles and I can build it out with the
required parameter values from the help that I get right down here.
[The REPL folder is open and the shortcut for a folder DfsrPrivate and the following files are
displayed: DfsrPrivate Testfile_(2).xsl Testfile_(1).xsl Testfile_(1).xml Testfile_(2).vbs Testfile_
(1).vbs Testfile_(1).tsp Testfile_(2).tib Testfile_(100).nls Testfile_(99).NLS Testfile_(98).NLS
Testfile_(97).NLS Testfile_(96).NLS Testfile_(95).NLS Testfile_(94).NLS Testfile_(93).NLS
Testfile_(92).NLS Testfile_(91).NLS Testfile_(90).NLS Testfile_(89).NLS Testfile_(88).NLS
Testfile_(86).NLS Testfile_(85).NLS Testfile_(84).NLS Testfile_(83).NLS Testfile_(82).NLS
Testfile_(81).NLS Testfile_(80).NLS Testfile_(79).NLS Testfile_(78).NLS Testfile_(77).NLS
Testfile_(76).NLS Testfile_(75).NLS Testfile_(74).NLS Testfile_(73).NLS Testfile_(72).NLS
Testfile_(71).NLS Testfile_(70).NLS Testfile_(69).NLS Testfile_(68).NLS Testfile_(67).NLS
Testfile_(66).NLS Testfile_(65).NLS Testfile_(64).NLS Testfile_(63).NLS Testfile_(62).NLS
Testfile_(61).NLS Testfile_(47).NLS Testfile_(46).NLS Testfile_(45).NLS Testfile_(44).NLS
Testfile_(43).NLS Testfile_(42).NLS Testfile_(41).NLS Testfile_(40).NLS Testfile_(39).NLS
Testfile_(38).NLS Testfile_(37).NLS Testfile_(36).NLS Testfile_(35).NLS Testfile_(34).NLS
The instructor clicks the View menu from the ribbon and clicks the Options button. As a result,
the Folder Options dialog box is displayed. The instructor clicks the View, which includes two
options as follows: Don't show hidden files, folders, or devices and Show hidden files, folders,
and drives. Next the instructor clicks Cancel to close the dialog box and navigates back to the
REPL folder. Next the instructor double-clicks the shortcut for the DfsrPrivate folder. The
Dfsrprivate includes folders - ConflictAndDeleted, Deleted, Installing,PreExisting, Staging, and
ConflictAnddeletedManifest.xml file. Next the instructor double-clicks the ConflictAndDeleted
folder, which includes the following files: Testfile_(48) Testfile_(49) Testfile_(50) Testfile_(51)
Testfile_(52) Testfile_(53) Testfile_(54) Testfile_(55) Testfile_(56) Testfile_(57) Testfile_(58)
Testfile_(59) Testfile_(60) The instructor now navigates back to the PowerShell ISE window.
The Commands pane of the window includes a modules drop-down list box, a Name field, and
a Parameters section.]

Now if you take a look, I have already entered the first cmdlet here, Get-
DfsrPreservedFiles, and here we specify the -Path. Now the path is not what you
might think, right? Normally you would think you would just point it at the directory path where
the folders reside, but what we actually pass – you can see it there,
C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml – is the
ConflictAndDeletedManifest.xml file, which displays the list of files that have
been deleted or are in conflict, and that is what gets put in this directory, right, conflicts and
deleted. And there I see them, right? Just like I would expect, there are 56, 53, 60 – they are all in
there, right? So here is what I am going to do. We are going to run this next cmdlet right now.
This one is a little bit longer and again, right, I could build it with the help from the PowerShell
ISE over here. Here I see there is the command Restore-DfsrPreservedFiles. And
where do I want to restore them? I want to restore them to their point of origin, right? I want to
Force that restoration, I want to copy the files, and AllowClobber means that if files with those
names already exist in the directory, clobber them, right? Replace them with the ones that are
coming out of the preserved content directory.

[The PowerShell ISE window is open and the following cmdlet is displayed: Get-
DfsrPreservedFiles -Path C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml The
instructor executes the code and details of each NLS file are displayed. Next the instructor
enters the following cmdlet: Restore-DfsrPreservedFiles -Path
C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml -RestoreToOrigin -AllowClobber -
CopyFiles -Force The instructor now navigates to the Commands pane of the PowerShell ISE
and selects Restore-DfsrPreservedfiles from the drop-down. As a result, two tabs are
displayed in the Parameters section: RestorePath and RestoreOrigin. The RestoreOrigin tab is
already selected and includes checkboxes as follows: RestoreToOrigin, AllowClobber,
Confirm, Copyfiles, Force, RestoreAllversions, and Whatif. The RestoreToOrigin,
AllowClobber, CopyFiles, and Force checkboxes are already checked.]

Now take a look at the command line here, right, Restore-DfsrPreservedFiles, and I
see the path there. But notice we don't pass just a folder path; we call the
manifest – here the ConflictAndDeletedManifest.xml – and we use the -RestoreToOrigin switch so it puts the files
back where they belong, plus -AllowClobber -CopyFiles -Force, right? So we go
ahead and we run that, and when we come back over here, now we will see these disappear
and the files returned. Just a second, let me refresh it here and bang! Just like that they
disappear before our eyes. And if we come back over to REPL here, right, where we wanted
them restored, let us just sort these by Type, so we get all our NLS files together
again. And what we should see now is...there they are, right? We were missing 47 to 60
before, and we see now that 47 to 60 have been restored. And those preserved-file
name suffixes have been stripped and the original filenames have been restored.
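As a minimal sketch of the two cmdlets used here, with the paths from this demo:

# List what is recoverable from the ConflictAndDeleted manifest
Get-DfsrPreservedFiles -Path "C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml"

# Restore everything back to its original location, overwriting same-named files if necessary
Restore-DfsrPreservedFiles -Path "C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml" `
    -RestoreToOrigin -AllowClobber -CopyFiles -Force

# The PreExistingManifest.xml in the same DfsrPrivate folder works the same way for pre-existing content.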

[The PowerShell ISE window is open and the following cmdlet is displayed: Restore-
DfsrPreservedFiles -Path C:\Share\REPL\DfsrPrivate\ConflictAndDeletedManifest.xml -
RestoreToOrigin -AllowClobber -CopyFiles -Force The instructor executes the command and
navigates back to the ComflictAndDeleted folder. Next the instructor navigates to the REPL
folder and clicks Type to sort the NLS files by their Type.]
The File Server Resource Manager in Server
2012 R2
Learning Objective
After completing this topic, you should be able to
◾ decide how to use File Server Resource Manager to accomplish a task

1. The FSRM role


Consistently, year over year, you are being asked to manage and maintain 40% more data than
you did the year before. And that is the average across the board; that is what we are all
looking at. We are all being asked to store and manage more data longer, especially if we fall
under the auspices of particular regulatory compliance. You know, I think about folks, maybe,
who work for the state of California, right? You are an elected official working for the
state of California – all your constituent e-mail must be kept for seven years, right? So if you
are the IT guy there, you have got to figure out how you are going to manage that.

So we are coming under more and more pressure to do that. One of the resources that we
have available to us to manage our file servers is the File Server Resource Manager, or
FSRM, in Windows Server 2012 R2. Now if you have seen this thing in previous iterations of
the operating system, it has really not changed much here, right? We can still do the things
here that we would expect to be able to do – I can configure notifications, I can classify files, I
can manage quotas, right? I want to say you get 2 GB of space and that is all you get. So let's
take a look in here at what we can do with the File Server Resource Manager today.

[The File Server Resource Manager includes the following nodes and subnodes in the
navigation pane: Quota Management Quotas Quota Templates File Screening Management
File Screens File Screen Templates File Groups Storage Reports Management Classification
Management Classification Properties Classification Rules File Management Tasks]

Now probably everybody here is familiar with disk quotas, right? Because we have had
those in Windows forever – you know, you can always go in and right-click on a disk, go into the
Properties, and then there is that Quota tab. And you could enable quotas and I could give
you, say, 2 GB of storage on that disk. Now the problem with that is that it is volume specific.
There is no granularity, right? I can't specify, for a particular folder for example, that you have
an upper limit. That is where FSRM quotas come in. When I install the File Server Resource
Manager on my file server, I can create granular folder-level quotas, right, very different from
disk quotas. And so we don't use these in combination. And in fact my FSRM quotas will always
override any disk quotas that have been applied.

Quotas come in two types: hard and soft, right? I give you a 2 GB limit; when you hit the limit,
what happens? You are denied any further storage – that is a hard quota, right? If you just get
warned but you are still permitted to save over the quota limit – that is a soft quota. And when
we think about quota limits, we think about the business. What is important to the business?
Are we so slim on storage that we have to enforce that hard quota? Or is it more important to the
flow of the business that users be able to save their business-related documents even when
they pass the quota?

Now FSRM is a Microsoft Management Console snap-in and so, of course, it comes with templates –
Quota Templates, right? I have got these predefined templates that I can apply to a folder, to a
directory, whatever it is I am managing with FSRM. And I can say, okay, you get a 500 MB
limit and that is your limit, right? We apply that template, and that is the limit they get. I
can create my own templates and I can manage those templates, you know, copy those
templates, apply them to my own folder structures, etc.

I can control the notifications in there, right? Does the user get an e-mail warning them? Does
the admin get an e-mail warning them, when they are getting close to the quota limit? Do the
admin and the user get an e-mail when they are actually beyond the quota limit? What
happens, do we log an event in the Event Viewer? Do I run a program, maybe, to look for old
files that might be able to be cleaned out or moved off to cheaper storage? These are the
things that I can do with templates.

[The File Server Resource Manager includes the following nodes and subnodes in the
navigation pane: Quota Management Quotas Quota Templates File Screening Management
File Screens File Screen Templates File Groups Storage Reports Management Classification
Management Classification Properties Classification Rules File Management Tasks The Quota
Template subnode is selected and contains the following details in the view pane: Quota
Template 100 MB Limit 200 MB Limit Reports to User 200 MB Limit with 50 MB Extension 250
MB extended Limit IT Depts. 200 MB Template Monitor 200 GB Volume Usage Monitor 500
MB Share Limit 100 MB 200 MB 200 MB 250 MB 200 MB 200 MB 500 MB]

When we think about our file servers, one of the things that we think about is the nature of the
documents that are going to be stored there. You know, I think about my marketing department
or my graphics department; I can profile the files by type that they create over there. They
create Word documents, JPEGs, MP4s, whatever, right? But I know what it is that they create.
And we can use file screens to prevent users from saving files of a type that is outside the
scope of what should be stored on the file server. So particularly, we think about malicious
software; we think about preventing viruses and worms from getting on our file servers and causing
damage.

So one of the things that we do is, maybe, we screen out all the usual suspects, right? We get
a file group for executables that is going to include the .exes, the .bats, the .cmds, all the
command-line files – everything in which some hacker might package up illicit code. And we
exclude those, or maybe we include those, right? Whatever it is that we want to allow, we
create a file group to include. And whatever it is we want to disallow, we create a file group to
exclude. And, of course, it is Microsoft, so there are built-in file groups – video files, picture files,
right – groups that I can use to apply these screens out of the box.

More and more we are challenged by the need to protect personally identifying information. I
want to secure that information. I want to secure it from users who should not have access to
it, from users in the outside world, right? But how do I know what is in all these files that are
getting created on my file servers? Well, one of the things I can do is create file classification
rules. Now we can actually use logic to do pattern matching. So maybe I am concerned about
social security numbers, or I am concerned about credit card numbers or bank account
numbers. I know what those numbers look like. I can create file classification rules that scan
the documents looking for number patterns that match social security numbers or credit cards, and then
I can lock those files out from everybody except the folks who should have access to that
sensitive data. That is what file classification rules do for me.
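A minimal sketch of a content-based classification rule from PowerShell – the property name, path, and regular expression below are illustrative assumptions, not values from this demo:

# Define a Yes/No property, then a rule that sets it when a file's content matches an SSN-like pattern
New-FsrmClassificationPropertyDefinition -Name "ContainsPII" -Type YesNo -Description "Flags files containing SSN-like patterns"

New-FsrmClassificationRule -Name "Find SSN patterns" `
    -Property "ContainsPII" -PropertyValue "Yes" `
    -Namespace @("C:\Share") `
    -ClassificationMechanism "Content Classifier" `
    -ContentRegularExpression @('\d{3}-\d{2}-\d{4}') `
    -ReevaluateProperty Overwrite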

[The General tabbed page of the Create Local Classification Property dialog box contains two
fields – Name and Description, and a Property type section.]

This is one of the greatest things about the File Server Resource Manager – file management
task. Now guys think about this with me. There is a folder, right? We will call that the scope.
We will identify that as the scope for this file management task. There is a folder. I don't think
there should be anything in that folder over 365 days old. So on the Condition tab, I can
specify the usage information, right? If it has not been accessed in 365 days, then I want to do
something about that. I want to get it off there. I want to expire that file.

Now I can do things like attach a task to this, so that maybe we copy those expired files
off the server and on to cheaper storage someplace. I can configure notifications, so that the
users who are going to be impacted by this get notified, say by an e-mail, right? If I have got
Active Directory Rights Management Services, or AD RMS, I can even apply templates, right?
AD RMS templates to lock these files down based on file classification rules. We identified a
social security number in there – apply an AD RMS template.

[The Create File Management Task dialog box includes seven tabs as follows: General,
Scope, Action, Notification, Report, Condition, and Schedule. The General tab is selected and
the General tabbed page contains two fields as follows: Task name and Description.]

One of the things that has frustrated all of us for a long time is a lack of good reporting
mechanisms on our file servers. And...you know, you can always do a search for large files,
right? Oh! I am running out of space, I do a search for large files, I get rid of all those old wim
files, right? Or I get rid of, you know, whatever it happens to be into your business, right? But I
get...I find those big files and I purge them out fast. Well today with storage reports
management, we can do searches for duplicate files; I can search by a particular user; I can
search for large files, of course, right, most recently accessed, least recently accessed; and
quota use. I can query everything on that box. And whatever it is I am looking for, whatever it is
I need, I should...for most things I can pull it right out with these storage reports.
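If you want to script one of these, here is a minimal sketch – the report name and namespace are assumptions for illustration, and -Interactive simply means generate it on demand:

# Generate large-file, duplicate-file, and quota-usage reports for a folder right now
New-FsrmStorageReport -Name "Share usage snapshot" -Namespace @("C:\Share") `
    -ReportType @("LargeFiles","DuplicateFiles","QuotaUsage") -Interactive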

[The Storage Report Task Properties dialog box contains four tabs as follows: Settings, Scope,
Delivery, and Schedule. The Settings tab is selected and the Settings tabbed page includes a
Report Name field and two sections - Report Data and Report formats. The Report data
section includes the Select reports to generate drop-down list box, which is partly displayed
and includes the following options: Duplicate Files File Screening Audit Files by File Group
Files by Owner Files by Property Folders by Property Large Files The Duplicate Files, File
Screening Audit, Files by File Group, Files by Owner, and Large Files options are checked.
The Report formats section includes options as follows: DHTML, HTML, XML, CSV, and Text.
The DHTML option is checked.]

2. Demo: Installing the FSRM role


Since Windows Server 2008, for our file servers we have had this great thing called the File
Server Resource Manager, and we are going to go ahead and install that File
Server Resource Manager here, right? We are going to do it in the GUI but, of course, I could
use the Install-WindowsFeature cmdlet, right? There it is, File Server Resource
Manager; I will accept the additional management tools that it wants to install. And there are no
additional features to install this time, so we will go ahead and we will install this thing. And the
beauty of the File Server Resource Manager is the manageability options that it gives us
when we think about maintaining data on the servers and limiting users' ability to store files on the
server – whether that is because of the file type or because of an exceeded quota, right? We don't
want anybody to store more than 2 GB of data on this box; that is all we have, right?
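For reference, the PowerShell equivalent of this GUI install is a one-liner – a sketch, assuming you are installing on the local server:

Install-WindowsFeature FS-Resource-Manager -IncludeManagementTools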

[The Server Manager Dashboard window is open. The toolbar of the window contains menus
as follows: Manage, Tools, View, and Help. The instructor clicks Manage and selects Add
Roles and Features from the drop-down and the Add Roles and Features Wizard is displayed
and the instructor clicks Next. The Installation Type page of the wizard is displayed and the
instructor clicks Next. As a result, the Server Selection page of the wizard is displayed and the
instructor clicks Next. The Server Roles page of the wizard is displayed. The page contains a
Roles section, which includes the following options: Active Directory Certificate Services Active
Directory Domain Services Active Directory Federation Services Active Directory lightweight
Directory Services Active Directory Rights Management Services Application Server DHCP
Server DNS Server Fax Server File and Storage Services File and iCSI Services File Server
BranchCache for Network Files Data Deduplication DFS Namespace DFS Replication File
Server Resource Manager File Server VSS Agent Service iSCSI Target Server iSCSI Target
Storage Provider (VDS and VSS) Server for NFS Work folders Storage Services Hyper-V
Network Policy and Access Services The instructor clicks the File Server Resource Manager
optionandthe Add Roles and Features Wizard dialog box is displayed, which contains an Add
Features button. Next the instructor clicks the Add Features button and navigates back to the
Server Roles page of the Add Roles and Features Wizard, and clicks Next. The Features page
of the wizard is displayed and the instructor clicks Next. As a result, the Confirmation page of
the wizard is displayed, Which includes an Install button. The instructor clicks the Install button
and as a result the Results page of the wizard along with Feature installation progress bar is
displayed.]

That is the kind of manageability of my file servers that I get with FSRM. So here we can see
the installation proceeding. And the reality of FSRM, I think is that it really does make the
management of our file servers far easier. When we think about cleaning out old files,
automating the process by which that happens, we think about controlling user access to the
shares – we get a lot of manageability here in FSRM. So you can see, we installed the role;
now the tool is available to us and we will take a look at this in future demos.

[The Results page of the Add Roles and Features Wizard is open along with the Feature
installation progress bar. The instructor clicks the Close button to close the wizard and
navigates to the Server Manager Dashboard. Next the instructor clicks Tools and selects File
Server Resource Manager from the drop-down menu.]
Configuring FSRM in Windows Server 2012 R2
Learning Objectives
After completing this topic, you should be able to
◾ describe how to configure an FSRM quota template in a given scenario
◾ identify the steps to create a file screen template

1. Demo: FSRM quotas


Now folks, probably everybody here is familiar with disk quotas, right? And this is the way we
used to do things. We had this thing called disk quotas in the past, and the idea here was that on
my file servers, I wanted to limit the amount of disk space that any one user could consume,
right? The problem with this is that they are not at all granular; the quotas applied here cover
the entire disk. What we want is a more granular management solution, and that is what we
get in File Server Resource Manager, or FSRM.

So if we come on over here, here we are in the File Server Resource Manager console.
And we can see in here Quota Management, and down here I have got Quotas – you can
see I have one existing quota here – and then I have these Quota Templates. And these Quota
Templates are predefined limits and warnings, right? So in the example here, any user that has
rights to this share has a limit of 100 MB. That is how much data they can store in this
directory, right?

[Windows Explorer is open, and in the navigation pane, This PC is selected, whose contents
are displayed in the view pane. This PC includes the Local Disk (C:) and DVD Drive (D:). The
instructor right-clicks Local Disk (C:) and from the shortcut menu, clicks Properties, which
launches the Local Disk (C:) Properties dialog box. This dialog box includes nine tabs:
General, Tools, Hardware, Sharing, Security, Shadow Coples, Previous Versions, Quota, and
Classification, from which the General tab is open. The instructor clicks the Quota tab, which
displays the message "Status: Disk quotas are disabled." The instructor closes the dialog box
and navigates to the File Server Resource Manager (Local) Console, in which the navigation
pane displays the following nodes: Quota management, File Screening Management, Storage
Reports Management, Classification Management, and File Management Tasks. The Quota
Management node is already selected, which the instructor expands and the nodes Quotas
and Quota Templates are displayed. The instructor selects the Quota Templates node and a
list of Quota Templates is displayed in a table in the view pane. The table includes four
columns: Quota Template, Limit, Quota Type, and Description. The Quota Template column
displays seven quota templates, from which the instructor right-clicks 100 MB Limit and from
the shortcut menu, clicks Edit Template Properties. As a result, the Quota Template Properties
for 100 MB Limit dialog box is displayed. This dialog box includes the "Copy properties from
quota template (optional)" drop-down list, from which 100 MB Limit is already selected. The
dialog box also includes the Settings section, which consists of the Template name filed and
Description (optional) field. In the Template name field, the value 100 MB Limit is already
entered. Within the Settings section, the dialog box includes the Space limit section, in which
the Limit field displays the value 100.000 and its drop-down list displays the option MB. The
Space limit section includes the options "Hard quota: Do not allow users to exceed limit" and
"Soft quota: Allow users to exceed limit (use for monitoring)" from which "Hard quota: Do not
allow users to exceed limit" is already selected. The Settings section also includes Notification
thresholds, which consists of five columns: Threshold, E-mail, Event Log, Command, and
Report. The Threshold column includes three entries: Warning (85%), for which only E-mail is
ticked; Warning (95%), for which E-mail and Event Log are ticked; and Warning (100%), for
which E-mail and Event Log are ticked.]

Now in the example here, this is just a template, so it just defines some property sets; it
doesn't actually apply anywhere, but this gives you an idea of how you define the properties.
So now in here you can make the quota hard or soft, right? A hard quota means when they have
100 MB of data stored in the directory, that is it – they can't store any more. Commonly we use
soft quotas. Soft quotas warn the user that they are exceeding the quota threshold, but they let
them save the data. You know, how do we make the decision between these? If the data that
drives our business is more important than the disk space, then we use the soft quota. If the
available disk space is more critical than the data that drives our business being saved there,
then we enforce those hard quotas.

And then down here, we can have these threshold notifications. When they hit 85% of their
quota warn them with an e-mail. When they hit 95% of their quota warn them with an e-mail
again, and log an event log entry. When they are at a 100% of the quota warn them in the
e-mail, log it in the event log, and maybe I write a script. See this Command choice, I could tie
a script to this. And that script, maybe, could run and it could purge files of a certain age or a
certain size, you know, whatever was important for me to clean up that disk.

[The File Server Resource Manager (Local) Console is open and the Quota Template
Properties for 100 MB Limit dialog box is displayed. This dialog box includes the "Copy
properties from quota template (optional)" drop-down list, from which 100 MB Limit is already
selected. The dialog box also includes the Settings section, which consists of the Template
name filed and Description (optional) field. In the Template name field, the value 100 MB Limit
is already entered. Within the Settings section, the dialog box includes the Space limit section,
in which the Limit field displays the value 100.000 and its drop-down list displays the option
MB. The Space limit section includes the options "Hard quota: Do not allow users to exceed
limit" and "Soft quota: Allow users to exceed limit (use for monitoring)" from which "Hard quota:
Do not allow users to exceed limit" is already selected. The Settings section also includes
Notification thresholds, which consists of five columns: Threshold, E-mail, Event Log,
Command, and Report. The Threshold column includes three entries: Warning (85%), for
which only E-mail is ticked; Warning (95%), for which E-mail and Event Log are ticked; and
Warning (100%), for which E-mail and Event Log are ticked.]

These templates can be used...now you can see we can edit these templates; I could change
the type of quota in here if I wanted, right, I could add new notification thresholds, etc., right?
But we are not worried about that; what we are worried about is creating our own quotas. And
we can create our own quotas based either on a template or on our own defined
property sets. So for example, I have this folder called Marketing. This is a share – it is
actually a Distributed File System, or DFS, share. It is replicated as part of DFS and it is
available to marketing users across the organization. They use it to save files for the graphics
guys. That is what it is for. So we can give this a limit based on an existing template, and if I
look here I see the biggest one that I have out of the box is the 1 GB hard limit. That is not
enough, because they are using these big graphics files. And so I want to
give everybody that uses this share, you know, at least a 4 GB limit, and this is the Graphix
Staging folder.

[The File Server Resource Manager (Local) Console is open and the Quota Template
Properties for 100 MB Limit dialog box is displayed. In the Space limit section, the instructor
selects the option "Soft quota: Allow users to exceed limit (use for monitoring)" and clicks
Cancel. As a result, the dialog box closes and the File Server Resource Manager (Local)
Console is displayed. In the navigation pane, the instructor selects Quotas, then right-clicks
Quotas, and from the shortcut menu, clicks Create Quota, which launches the Create Quota
dialog box. In this dialog box, the instructor clicks Browse for Quota path and the Browse For
Folder dialog box is displayed, from which the instructor expands Local Disk (C:), then
expands Share folder, selects the Marketing folder, and clicks OK. As a result, the Browse For
Folder dialog box closes and the Create Quota dialog box is displayed, in which the path
C:\Share\Marketing is displayed in the Quota path field. The Create Quota dialog box includes
the options "Create quota path" and "Auto apply and create quotas on existing and new
subfolders," from which "Create quota on path" is already selected. The dialog box also
includes the Quota properties section, which displays two options to configure quota
properties, The first option is "Derive properties from this quota template (recommended),"
which includes the drop-down list that displays the value 100 MB Limit. The second option to
configure quota properties is "Define custom quota properties," which includes the button
Custom Properties that is disabled. From these options, the "Derive properties from this quota
template (recommended)" is already selected. The instructor opens the drop-down list for
setting the size limit and clicks 1GB Hard Limit Template. Then the instructor selects the option
"Define custom quota properties" and the Custom Properties button is enabled. The instructor
clicks Custom Properties and the Create Quota dialog box is displayed, in which, within the
Space limit section, the instructor opens the measurement unit drop-down list, clicks GB, and
enters the value 4 in the Limit field. Then the instructor clicks Add and the Add.]

Down here we can Add notifications. I want to send an e-mail to the administrator when this
happens or I send an e-mail to the user, and I can define a custom e-mail that explains to the
user: what is going to happen, where they are at etc. Now for this to work guys, you have to
have a Simple Mail Transfer Protocol, or SMTP, virtual server installed and configured on this
machine. If you don't, right, you will get a warning that says that, you know, you don't have
that. I can send a warning to the event log; again I can write a script that runs. And here I can
generate reports based on Duplicate Files, a File Screening Audit, files by group or by owner,
right, by user, Quota Usage. And these reports will give me real time insight into how this file
server is being used. When I go to save my quota I am prompted to "Do I want to save the
quota as a template?" If we use lots of big files then maybe. I will call this the 4 GB Hard
Limit template and I will save it as a template.
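The same quota could be built from PowerShell with the FSRM cmdlets – a sketch only; the e-mail address is a placeholder, and the threshold action assumes the SMTP settings mentioned above are configured:

# An 85% warning that e-mails an administrator (placeholder address)
$action    = New-FsrmAction -Type Email -MailTo "admin@corp.brocadero.com" -Subject "Quota warning" -Body "This folder is at 85% of its quota."
$threshold = New-FsrmQuotaThreshold -Percentage 85 -Action $action

# A reusable 4 GB hard-limit template, then a quota on the Marketing share derived from it
New-FsrmQuotaTemplate -Name "4 GB Hard Limit" -Size 4GB -Threshold $threshold
New-FsrmQuota -Path "C:\Share\Marketing" -Template "4 GB Hard Limit"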

[The File Server Resource Manager (Local) Console is open and the Add Threshold dialog box
is displayed. This dialog box includes four tabs: E-mail Message, Event Log, Command, and
Report. From these tabs, the E-mail Message tab is already selected, which includes the
"Send e-mail to the following administrators" checkbox with an associated field, and the "Send
e-mail to the user who exceeded the threshold" checkbox. The E-mail Message tab also
includes the Message body field, which displays a message. The instructor selects the "Send
e-mail to the user who exceeded the threshold" checkbox and then clicks the Event Log tab,
which includes the "Send warning to event log" checkbox. The instructor selects the "Send
warning to event log" checkbox and clicks the Command tab and then clicks the Report tab,
which includes the "Generate reports" checkbox and the Select reports to generate list box that
consists of Duplicate Files, File Screening Audit, Files by File Group, Files by Owner, Files by
Property, Large Files, Least Recently Accessed Files, Most Recently Accesses Files, and
Quota Usage checkboxes. Then instructor clicks OK, the dialog box closes, and the Create
Quota dialog box is displayed. In the Create Quota dialog box, the instructor clicks Create and
the Save Custom Properties as a Template dialog box is displayed, which allows you to save a
quota template. This dialog box includes the option "Save the custom properties as a
template," which is already selected, the "Template name" field, and the option "Save the
custom quota without creating a template." In the Template name field, the instructor enters 4
GB Hard Limit and clicks OK. The Save Custom Properties as a Template dialog box closes,
the Create Quota dialog box closes, and the File Server Resource Manager (Local) Console is
displayed.]

And now when I look down here in Quota Templates, I see here is my 4 GB Hard Limit
template. Oh...and look, when I saved it as a template it saved the description. So I want to come in here
and edit that, but I don't really need that. Now you will notice when I edit the existing template, I
am prompted: apply the template only to derived quotas that match the original template, apply the
template to all derived quotas, or do not apply the template to derived quotas. So if I make a
change to one of my templates, I can forcibly apply those changes to all of the quotas that
were derived from the original template, right? I can essentially update them all. If I don't want
to update the existing quota applications, I say do not apply. And that is a look at Quotas and
Quota Templates – Quota Management in FSRM.

[The File Server Resource Manager (Local) Console is open and in the view pane the quota
paths are displayed. In the navigation pane, the instructor selects the Quota Templates node
and the quota templates are displayed in the view pane. In the Quota Template column, the 4
GB Hard Limit quota template is displayed, for which the Limit is 4.00 GB, Quota Type is Hard,
and Description is Graphix Staging. The instructor right-clicks 4 GB Hard Limit quota template
and from the shortcut menu clicks Edit template Properties, which launches the Quota
Template Properties for 4 GB Hard limit dialog box. This dialog box includes Template name
field, in which 4 GB Hard Limit is displayed, and Description (optional) field, in which Graphix
Staging is displayed. The instructor removes Graphix Staging from the Description (optional) field and
clicks OK, which displays the Update Quotas Derived from Template dialog box.
This dialog box includes three options to apply the template changes to existing quotas: Apply
template only to derived quotas that match the original template, Apply template to all derived
quotas, and Do not apply template to derived quotas. From these options, the option "Apply
template only to derived quotas that match the original template" is already selected. The
instructor selects the option "Do not apply template to derived quotas" and clicks OK. As a
result, the dialog box closes and the File Server Resource Manager (Local) Console is
displayed.]

2. Demo: FSRM file screens


Here we are in the FSRM Management Console, and I can see here this thing called File
Screening Management. And if I look, there are File Screens, File Screen Templates, and File
Groups. Now the first thing to understand, guys, is the File Groups, and what you have here are
the usual suspects organized by file type. Now guys, think about this with me, right?
When we think about threats to our machines, right – my users store data on this file server, and
maybe they pull that data down from the public Internet. I want to know that nobody is
accidentally or intentionally saving malicious content onto this server where there
should only be data files. And so we group files by type. Who are the usual suspects when we
think of viruses and worms, right? We think about executables; we think about .bat, .cmd, .com files,
right? Those are the files that we think about – installer files, package files.

[File Server Resource Manager (Local) Console is open and in the navigation pane, the
following nodes are displayed: Quota Management, File Screening Management, Storage
Reports Management, Classification Management, and File Management Tasks. From these
nodes, Quota Management is already selected. The instructor expands the File Screening
Management node and the following nodes are displayed: File Screens, File Screen
Templates, and File Groups. From these nodes, the instructor selects the File Groups node
and three columns, File Groups, Include Files, and Exclude Files, are displayed in the view
pane. The File Groups column includes the following entries: Audio and Video Files, Backup
Files, Compressed Files, E-mail Files, Executable Files, Image Files, Office Files, System
Files, Temporary Files, Text Files, and Web Page Files. In the Include Files column, for
Executable Files, the following file types are included: *.bat, *.cmd, *.com, *.cpl, *.exe, *.inf,
*.js, *.jse, *.msh, *.msi, *.msp, *.ocx, and *.pif. In the Include Files column, for Image Files, the
following file types are included: *.bmp, *.dib, *.eps, *.gif, *.img, *.jfif, *.jpe, *.jpeg, *.pcx, *.png,
*.ps, and *.psd.]

I can eliminate executable files from my data drives with file screens, and that is one example.
You know, you think about image files, and here they are...we are not talking about server images, right,
or client images, OS images – we are not talking about that. We are talking about picture
files, right? I will tell you, I used to do some work in Jersey City, and at our place there, there
was a kid...and he was a kid too. We found 50 GB of his MP3 files on one of our file servers,
right? Now if we had used a file screen to screen out audio and video files, right, that kid
would not have been able to save that 50 GB of his personal data on our corporate network,
nor would he have then been able to share it out with his 5,000 closest friends on Napster,
right, using our bandwidth – Napster, right? That tells you how old that story is. But this thing
solves those problems; that is what we want it for.

[File Server Resource Manager (Local) Console is open and the File Groups page is displayed
in the view pane. In this page, three columns, File Groups, Include Files, and Exclude Files,
are displayed. In the Include Files column, for Audio and Video Files, the following file types
are included: *.aac, *.aif, *.aiff, *.asf, *.asx, *.au, *.flac, *.m3u, *.mid, *.midi, *.mp1, *.mp2,
*.mp3, *.mp4, *.mpa, *.mpe, *.mpeg, *.mpeg2, *.mpeg3, *.mpg, *.ogg, *.qt, *.swf, *.vob, *.wav,
*.wma, *.wmv, and *.wvx.]

Now there are these file screen templates, right, and I can apply these to shares – block
audio and video files, e-mail files, and so on. These are pre-existing collections of File Groups
that I can apply to a shared directory. And I can create my own file screens, right? If I come in
here and say Create File Screen, we can specify the path for this file screen. And here I am
going to show you: I have got this Share, and one of the things that we do here is...we image
our machines, right? Now here, when I use the word images, I am talking about operating
system images. And those operating system images...as a matter of fact I am just going to do
this, go ahead and we will rename this to OS_Images. Those operating system images
are always of the .wim type – the .wim file format. And nothing else should go in there, so I
save that. Now here are the templates – audio and video files. Certainly there should be none of
those here – executables, images, right, none of these should be there.

[File Server Resource Manager (Local) Console is open and the File Groups page is
displayed. In the navigation pane, the instructor clicks File Screen Templates and three
columns, File screen Template, Screening Type, and File Groups, are displayed. The File
screen Template column includes five templates, which are Block Audio and Video Files, Block
E-mail Files, Block Executable Files, Block Image Files, and Monitor Executable and System
Files. Then in the navigation pane, the instructor right-clicks File Screens node and from the
shortcut menu, clicks Create File Screen, which launches the Create File Screen dialog box.
This dialog box includes the File screen path field with a Browse button. The instructor clicks
Browse and the Browse For Folder dialog box is displayed. In this dialog box, the instructor
expands Local Disk (C:), then expands the Share folder, and selects the Images folder. Next
the instructor renames Images to OS_Images, then clicks OK, the Browse For Folder dialog
box closes, and the Create File Screen dialog box is displayed. This dialog box displays two
options to configure file screen properties; first option is "Derive properties from this file screen
template (recommended)" and the second option "Define custom file screen properties." From
these options, the option "Derive properties from this file screen template (recommended)" is
already selected and includes a drop-down list from which Block Audio and Video Files is
already selected. The instructor opens the drop-down list and the following options are
displayed: Block Audio and Video Files, Block Executable Files, Block Image Files, Block
E-mail Files, and Monitor Executable and System Files.]

So down here I could say define custom properties; we come in here and now here are the
templates. Look at this...there should not be any audio and video files, right? We may need some
Compressed Files there, but no E-mail Files, no executables, no Image Files, no Office Files –
these are .wim files – no Text Files, no Web Page Files; I don't want any of those there. And I want
active screening: do not allow users to save unauthorized files to this share. Passive
screening warns them, but it lets them save them anyway, right?

Now over here we can configure notifications, right? Do they get e-mailed when there is a
problem? Do we log an event to the Event Viewer? Do we run a command line? And of course,
we can generate storage reports from here too, based on this usage. And here again we are
prompted: do we want to save these custom properties as a template? Well this is a very
exclusive kind of thing; we are not really going to need this much more, right? So we will just
save the file screen without creating a template. Now we can see that this has been created; I
can edit the file screen properties here, or I could create a template from this file screen if I
decided that I needed to do that. So guys, this is a look at File Screen Management in Windows Server 2012 R2.

[File Server Resource Manager (Local) Console is open and the Create File Screen dialog box
is displayed. In this dialog box, the instructor selects the option "Define custom file screen
properties" and its associated Custom Properties button becomes enabled. Then the instructor
clicks Custom Properties and the File Screen Properties on C:\Share\OS_Images dialog box is
displayed, which includes the tabs: Settings, E-mail Message, Event Log, Command, and
Report. From these tabs, the Settings tab is already selected, which includes the "Select file
groups to block" list box. This list box includes the following checkboxes: Audio and Video
Files, Backup Files, Compressed Files, E-mail Files, Executable Files, Image Files, Office
Files, System Files, Temporary Files, Text Files, and Web Page Files. From these checkboxes,
the instructor selects Audio and Video Files, E-mail Files, Executable Files, Image Files, Office
Files, Text Files, and Web Page Files. The Settings tab also includes the following two options
for screening type: "Active screening: Do not allow users to save unauthorized files" and
"Passive screening: Allow users to save files (use for monitoring)." From these options, the
option "Active screening: Do not allow users to save unauthorized files" is already selected.
The instructor clicks the E-mail Message tab, then clicks the Event Log tab, and then clicks the
Report tab. Then the instructor clicks the Settings tab again and clicks OK. As a result, the File
Screen Properties on C:\Share\OS_Images dialog box closes and the Create File Screen
dialog box is displayed. In this dialog box, the instructor clicks Create and the Save Custom
Properties as a Template dialog box is displayed, which includes two options to save a file
screen template. The first option is "Save the custom properties as a template" with a
Template name field, and the second option is "Save the custom file screen without creating a
template." From these options, "Save the custom properties as a template" is already selected.
The dialog box also includes the "Do not ask me to save as a template" checkbox. The
instructor selects the option "Save the custom file screen without creating a template," selects
the "Do not ask me to save as a template" checkbox, and clicks OK. As a result, the Save
Custom Properties as a Template dialog box closes, the Create File Screen dialog box also
closes, and the File Server Resource Manager (Local) Console is displayed, in which, within
the File Screens page, the File Screen Path C:\Share\OS_Images is displayed with the Screen
Type as Active and File Groups as Block: Audio and Video Files. The instructor right-clicks this
File Screen Path and from the shortcut menu clicks Edit File Screen Properties, which
launches the File Screen Properties on C:\Share\OS_Images dialog box. Then the instructor
closes the dialog box and the File Screens page of the File Server Resource Manager (Local)
Console is displayed.]
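
If you ever need to script this instead of clicking through the console, the same screen can be created with the FSRM cmdlets. This is only a sketch of what was just done in the demo – the path and the list of blocked groups below simply mirror the dialog, and you should verify the parameters with Get-Help New-FsrmFileScreen:

# Active file screen on the OS image share: block everything except the groups
# deliberately left unchecked in the demo (for example, Compressed Files).
Import-Module FileServerResourceManager
New-FsrmFileScreen -Path "C:\Share\OS_Images" -Active -IncludeGroup @(
    "Audio and Video Files", "E-mail Files", "Executable Files",
    "Image Files", "Office Files", "Text Files", "Web Page Files")

# Confirm the screen exists.
Get-FsrmFileScreen | Format-List Path, Active, IncludeGroup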

3. Demo: File classifications


One of the things that many of us are contending with is file server sprawl, right guys? Every
year we are tasked on average with storing 40% more data than the year before. That is an
exponential increase; you know, it blows my mind that for less than 100 bucks I can buy a
terabyte USB drive today. And so one of the things that we have – especially for those of us that
are governed by Sarbanes-Oxley compliance; Kennedy-Kassebaum Health Insurance
Portability and Accountability Act, or HIPAA, compliance; regulatory compliance; and
corporate compliance today – is file classification. And I want to highlight that this is one
piece of a new kind of dynamic security model from Microsoft, which incorporates both file
classification and a separate piece of technology called Dynamic Access Control, or DAC. So
guys, we are going to look at the file classification piece of this today. How do I identify files?
How do I automate the process of identifying files that might be treated differently than other
files? So you know, we look in here – maybe I have a need to apply some of these default kinds
of properties to files, associate them with particular departments, rather than relying on just
the folder and the discretionary access control list on that folder, right? I may want to mark these
as all belonging to the marketing department.

[File Server Resource Manager (Local) Console is open and in the navigation pane, the
following nodes are displayed: Quota Management, File Screen Management, Storage
Reports Management, Classification Management, and File Management Tasks. From these
nodes, Classification Management is expanded and the nodes Classification Properties and
Classification Rules are displayed. The instructor clicks Classification Properties and the
following five columns are displayed in the view pane: Name, Scope, Usage, Type, and
Possible values. The Name column includes the following properties: partially visible Access-
Denied Assi..., Department, Folder Owner Email, Folder Usage, and Intellectual Property.
From these properties, the instructor double-clicks Department and the View Global
Classification Property dialog box is displayed, which includes Name field, ID field, and
Description field. The instructor clicks OK, the dialog box closes, and the Classification
Properties page of the File Server Resource Manager (Local) Console is displayed.]

One of the things that I can do is I can create my own classifications. So for example, let's say
that we are bound by Sarbanes-Oxley and so the personally identifying information of our
customers is critical to us. And so I want to be able to automate the process by which I search
through client records, and I mark those files as containing PII because the clients' phone
numbers are in there, their social security numbers are in there, their credit card numbers, or
account numbers are in there, right? Whatever it is I am looking for. And I want to mark them.
Let's say...here are the kinds of property types we have, right? For our purposes I just want to
identify the files as having personally identifying information, so I can use a Yes/No field. Now
if I had other choices, right – if I wanted to choose from a list – I could create a list of values in
here, much like the department list we saw a moment ago, right? I could look for a date-time
stamp. For our purposes we are going to do Yes/No.

[File Server Resource Manager (Local) Console is open and the Classification Properties
page is displayed. The instructor right-clicks Classification Properties and from the shortcut menu
clicks Create Local Property, which launches the Create Local Classification Property dialog
box. This dialog box includes the Name field, in which the instructor enters the name PII. The
Property type section of the dialog box also includes a drop-down list from which Yes/No is
already selected. The instructor opens the drop-down list and the following options are
displayed: Yes/No, Date-time, Number, Multiple Choice List, Ordered List, Single Choice,
String, and Multi-string. Then the instructor clicks OK, the dialog box closes, and the
Classification Properties page of the File Server Resource Manager (Local) Console is
displayed.]
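
For reference, the same local property can be created from PowerShell. A hedged sketch – I am assuming the Yes/No choice in the dialog maps to the YesNo type value on the cmdlet, so check Get-Help New-FsrmClassificationPropertyDefinition before relying on it:

# Create a local Yes/No classification property named PII, like the one in the dialog.
Import-Module FileServerResourceManager
New-FsrmClassificationPropertyDefinition -Name "PII" -Type YesNo `
    -Description "File contains personally identifying information"

# It should now show up next to Department, Folder Usage, and the others.
Get-FsrmClassificationPropertyDefinition | Select-Object Name, Type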

Over here we can create the classification rule that leverages this classification property. So I
right-click here and I say Create Classification Rule. I am going to call this PII SOX
Compliance. To what does it apply – User Files, right? Everything here is a user file; it is
created by the users. These are going to be user files. There are no backups stored here; they
are handled separately anyway, so I am less concerned about them.

Here is how the users get in here. There is a folder called Share – that is the share point that
everybody comes into. And there is a directory here called Client Records, and that is where all
the client records are stored, right? There is a subdirectory in there for every account...each
person's stuff is in there, and it should be accessible only to them. We want to go through there
and we want to automate the process by which this content is classified, right? And you can
see in here we have got the Content Classifier, we have got the Folder Classifier – basically
the same thing, but it is just for that folder – and then the Windows PowerShell Classifier. So if
you are writing your own PowerShell statements, you can use that. For our purposes, I am going
to do a Content Classifier.

And what we are looking for is the PII property that we created. We want to mark the file as
Yes if certain conditions are met. Now if you look in here, right? I can do a case-sensitive
string or a simple string.

[File Server Resource Manager (Local) Console is open and the Classification Properties
page is displayed. In the navigation pane, the instructor right-clicks Classification Rules and from
the shortcut menu, clicks Create Classification Rule, which launches the Create Classification
Rule dialog box. This dialog box includes four tabs: General, Scope, Classification, and
Evaluation Type. From these tabs, the General tab is open by default, within which the
instructor enters the name PII SOX Compliance in the Rule name field. Then the instructor
clicks the Scope tab, which allows you to include the folders that store data. The Scope tab
includes the following checkboxes: Application Files, Backup and Archival Files, Group Files,
and User Files. From these checkboxes, the instructor selects User Files and clicks Add, which
launches the Browse For Folder dialog box. In this dialog box, the instructor expands Local
Disk (C:), then expands the Share folder, selects the Client Records folder, and clicks OK. As
a result, the Browse For Folder dialog box closes, and in the Scope tab of the Create
Classification Rule dialog box, in the "The following folders are included in this scope" section,
in the Folder column, the path C:\Share\Client Records is displayed. Then in the Create
Classification Rule dialog box, the instructor clicks the Classification tab, which includes the
"Choose a method to assign a property to files: drop-down list, from which Content Classifier is
already selected. Then the instructor opens the drop-down list and the following options are
displayed: Content Classifier, Folder Classifier, and Windows PowerShell Classifier, from
which the instructor selects Content Classifier. The Classification tab also includes the
Property section, which consists of "Choose a property to assign to files" drop-down list from
which Department is already selected, and the "Specify a value" drop-down list from which
Administration is already selected. The instructor opens the "Choose a property to assign to
files" drop-down list and the following options are displayed: Department, Intellectual Property,
and PII, from which the instructor selects PII. As a result, the value in "Specify a value" drop-
down list is displayed as Yes. The Classification tab also includes the Parameters section,
which includes the Configure button. The instructor clicks Configure and the Classification
Parameters dialog box is displayed, which includes four columns: Expression Type,
Expression, Minimum Occurrences, and Maximum Occurrences. The Expression Type column
includes a drop-down list entry, from which Regular expression is already selected. The
instructor opens this drop-down list and the options Regular expression, String (case-
sensitive), and String are displayed, from which the instructor selects String and another entry
appears in the Expression Type column.]

So for example, I could look for the literal text Social Security Number. Now of
course, that is not good enough because there might be a form that says social security
number or a letter that says social security number, but the number is not there itself. So what I
want to do is I want to write a Regular expression.

Now guys, look – if you like hieroglyphics and you are into that kind of thing, you will love regular
expressions; otherwise they are kind of a pain. So if you look here, right...how do I do
this, looking for a social security number? What I do is I say I am looking for values in the
range of 0-9, right? So I am looking for a number value, and that number value occurs 3
times. So I use the squiggle brackets for that, then there is a literal character, right – there will
always be a hyphen.

And then again in square brackets 0-9 – we are looking for a numeric value – and squiggle
brackets, that occurs 2 times. Then there is another literal character – there is the
hyphen. And then finally we are looking for 0-9, 4 times – dig it! That is the regular expression
that defines a social security number. And you can see I could go ahead and define a regular
expression for a phone number, credit card numbers, bank account numbers, and so on – whatever it
is I am looking for. And frankly, there is a lot more that you can do.
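
If regular expressions really do look like hieroglyphics to you, it helps to test the pattern on its own before pasting it into the Classification Parameters dialog. A quick, self-contained check in PowerShell, using an obviously made-up number:

# Three digits, a hyphen, two digits, a hyphen, four digits.
$ssnPattern = '[0-9]{3}-[0-9]{2}-[0-9]{4}'

'SSN on file: 123-45-6789' -match $ssnPattern                  # True  - an SSN-shaped number is present
'Please supply your social security number' -match $ssnPattern # False - the words alone do not match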

[File Server Resource Manager (Local) Console is open and the Classification Parameters
dialog box is displayed, which includes the four columns Expression Type, Expression,
Minimum Occurrences, and Maximum Occurrences. The Expression Type column includes
two drop-down list entries; from the drop-down list, the option for the first entry is displayed as
String and the option for the second entry is displayed as Regular Expression. For both
entries, the value in the Minimum Occurrences column is 1. For the first entry, in the Expression
column, the instructor enters Social Security Number. In the Expression column, for Regular Expression, the
the instructor enters [0-9]{3}-[0-9]{2}-[0-9]{4}. As a result, a third entry is created in the
Expression Type, with the default option Regular Expression and the Minimum Occurrences
value 1. Then the instructor clicks OK, the Classification Parameters dialog box closes, and the
Classification tab of the Create Classification Rule dialog box is displayed.]

We have never run this before against this, so I want to re-evaluate the existing properties and
I want to clear anything that is on there already and overwrite with these new values. Now
there should not be anything because we have not done this before, right? Now when do we
want this thing to run? Well for that we are going to have to create a classification schedule.
And you can see in here, I can come in here and enable a fixed schedule. I can say that this
runs every day at a certain time, change that to a PM, right? Maybe it runs daily.

Every evening we run it, so that the new files that were created that day get classified – bump!
Bump! Or we can allow for continuous classification of the files. And of course, we want to
generate log files, we want to generate errors, we want to be able to get reporting on this,
so that we know that it is executing on schedule and files are being classified. Again, using
DAC – Dynamic Access Control in Server 2012 R2 – we can also use these file classifications
to granularly control file access, but that is a whole other piece.

[File Server Resource Manager (Local) Console is open and the Classification tab of the
Create Classification Rule dialog box is displayed. The instructor clicks the Evaluation Type
tab, which includes the "Re-evaluate existing property values" checkbox that the instructor
selects. As a result, the section "When a conflict occurs between the new and the existing
value" is enabled, which includes the options "Overwrite the existing value" and "Aggregate the
values." From these options, "Aggregate the values" is already selected. The instructor selects
"Overwrite the existing value" and as a result, the following checkboxes are enabled: "Clear
Automatically Classified Property" and "Clear User Classified Property." The instructor selects
both the checkboxes, clicks OK, the Create Classification Rule dialog box closes, and the
Classification Rules page of the File Server Resource Manager (Local) Console is displayed.
Next in the navigation pane, the instructor right-clicks Classification Rules and from the
shortcut menu, clicks Configure Classification Schedule, which launches the File Server
Resource Manager Options dialog box. This dialog box includes the following tabs: File Screen
Audit, Automatic Classification, Access-Denied Assistance, Email Notifications, Notification
Limits, Storage Reports, and Report Locations. From these tabs, Automatic Classification is
selected by default. The Automatic Classification tab includes the "Enable fixed schedule"
checkbox, which the instructor selects and sets the Run at spin box to 8:05:04 PM. This tab
also includes the options Weekly and Monthly, from which Weekly is already selected and the
following checkboxes are displayed: Sunday, Monday, Tuesday, Wednesday, Thursday,
Friday, and Saturday. The instructor selects all the checkboxes and then selects the "Allow
continuous classification for new files" checkbox. The Automatic Classification tab also
includes the generate log section, which includes the Log file, Error log, and Audit log
checkboxes. From these checkboxes Log file and Error log checkboxes are already selected.
The tab also includes the Generate report section, which includes the "Generate a report"
checkbox that is already selected. The Generate report section also includes the following
checkboxes: DHTML, HTML, XML, CSV, Text, and Send reports to the following
administrators. From these checkboxes, DHTML is already selected.]
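
The whole rule and its schedule can also be built from PowerShell. Treat this as a sketch under the assumption that the content classifier takes the expression through -ContentRegularExpression and that -ReevaluateProperty Overwrite matches the "overwrite the existing value" choice – the names are from memory, so confirm them with Get-Help New-FsrmClassificationRule and Get-Help Set-FsrmClassification:

# Content-classification rule: set PII = Yes on files under C:\Share\Client Records
# that contain an SSN-shaped number.
Import-Module FileServerResourceManager
New-FsrmClassificationRule -Name "PII SOX Compliance" `
    -Namespace @("C:\Share\Client Records") `
    -ClassificationMechanism "Content Classifier" `
    -Property "PII" -PropertyValue "Yes" `
    -ContentRegularExpression @('[0-9]{3}-[0-9]{2}-[0-9]{4}') `
    -ReevaluateProperty Overwrite

# Nightly classification schedule plus continuous classification of new files.
$schedule = New-FsrmScheduledTask -Time (Get-Date "20:05") -Weekly @(
    "Sunday","Monday","Tuesday","Wednesday","Thursday","Friday","Saturday")
Set-FsrmClassification -Schedule $schedule -Continuous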

4. Demo: File management tasks


What we want to do is we want to be able to automate the processes of file management. One
of the things we are going to do is these file management tasks. So here in the File Server
Resource Manager under File Management Tasks, I'm going to right-click and I'm going to say
Create File Management Task. And now, in here, we are going to give this a name Expired
after one year. We have a directory. Files are placed in that directory. The nature of our
business is such that after a year we are probably never going to need those files again,
right. You guys know only about 2% of everything you archive ever gets looked at again. But why do
we archive at all? We archive it for that 2%. So I want to clear these files out after a year, move
them off to some cheap storage some place off the file server, and let's see how we can
automate that process. So over here on the Scope we are going to specify the directory to
which these rules should apply.

[The File Server Resource Manager (Local) Console is open and in the navigation pane, the
following nodes are displayed: Quota Management, File Screening Management, Storage
Reports Management, Classification Management, and File Management Tasks. The
instructor right-clicks File Management Tasks and from the shortcut menu, clicks Create File
Management Task, which launches the Create File Management Task dialog box. This dialog
box includes the following tabs: General, Scope, Action, Notification, Report, Condition, and
Schedule. From these tabs, the General tab is already selected, in which the instructor enters
the name Expired after one year in the Task name field, and then clicks the Scope tab, which
displays the Scope tabbed page.]

And the types of files...now for our purposes, this is a directory where users create files. There
are no application files in there. There are no backup or archival files in there. And down here,
we have to specify where that is. Now on this file server, in the C drive, users come in through
a Share and they hit this directory here...Good For a Year, right. Files placed in there are good
for a year. Over here, on the Action tab, you can see what we can do here, right? File
expiration is the common one. You can create Custom tasks in here, and then you have got
RMS Encryption. Now if you are using Active Directory Rights Management Services, what
that does is it lets me create these templates that actually control what happens to a file that
the template is applied to, after it leaves my hands. If I want to say that this file is always and
forever read-only, or it can't be attached to an e-mail message, it can't be forwarded as an
e-mail attachment, it can't be printed – I can define RMS templates that enforce that after I have
shared the file with you. And I can automate the process of applying those RMS templates
here with the file management tasks. For our purposes, we are going to do File expiration.

[The File Server Resource Manager (Local) Console is open and the Create File Management
Task dialog box is displayed. In this dialog box, the Scope tab is already selected, which
includes the "Include all folders that store the following kinds of data" section consisting of
following checkboxes: Application Files, Backup and Archival Files, Group Files, and User
Files. From these checkboxes, the instructor selects User Files, and in the "The following
folders are included in this scope" section, the instructor clicks Add, which launches the
Browse For Folder dialog box. In this dialog box, the instructor expands Local Drive (C:), then
expands the Share folder, selects the Good For a Year folder, and clicks OK. As a result, the
Browse For Folder dialog box closes and the Scope tab of the Create File Management Task
dialog box is displayed. Next the instructor clicks the Action tab, which includes the Type drop-
down list, in which the option File expiration is already selected. The instructor opens the drop-
down list and the following options are displayed: File expiration, Custom, and RMS
Encryption. From these options, the instructor selects File expiration.]

Now the File expiration is going to require an Expiration directory. And again, normally, what I
would do is I would offload this to some less expensive storage some place on the backend.
For our purposes, I have got an Expired directory here that I'm going to stick em in, right. So
the file has not really been deleted. It is moved to an archive, and if we ever have to go back
and get it, we can. On the Notification page, I can shoot e-mails either to the user who
created the file that has been archived or to the administrators so that they know these files have
been archived. This requires an SMTP virtual server be configured on the machine, alright
guys. If you ever try to do this and you have a problem, it is commonly because there is no
SMTP virtual server configured here to send that e-mail out. And over there, I can log events in
the Event Viewer or I can specify a script file to run to perform some other task, in addition to
this. Let's Cancel out of that. Over here, I can specify reporting, right? Is there a Log file? Is
there an Error log file? Do we audit here? How are the...what is the report format if we
generate reports? Who do they go to again? This requires an SMTP virtual server be
configured on this machine.

[The File Server Resource Manager (Local) Console is open and the Create File Management
Task dialog box is displayed. In this dialog box, the Action tab is already selected. In the
Action tab, from the Type drop-down list, File expiration is already selected. The Action tab
includes Expiration directory field with a Browse button. The instructor clicks Browse and the
Browse For Folder dialog box is displayed. The instructor expands Local Drive (C:), then
expands the Share folder, selects the Expired folder, and clicks OK. As a result, the Browse
For Folder dialog box closes and in the Action tab, within the Expiration directory field, the
name C:\Share\Expired is displayed. Next the instructor clicks the Notification tab, which
includes four columns: Days Before Running Action, E-mail, Event Log, and Command. The
instructor clicks Add and the Add Notification dialog box is displayed. In this dialog box, the
"Specify what notifications to generate before running the task" section includes the Advance
notification (in days) spin box, which is already set to 15. The Add Notification dialog box also
includes three tabs: E-mail Message, Event Log, and Command. From these tabs, E-mail
Message is already selected, which consists of the "Send e-mail to the following
administrators" checkbox with a field that is disabled. The E-mail Message tab also consists of
the "Send an email to users with affected files" checkbox. The instructor clicks OK and the File
Server Resource Manager message box is displayed, which includes the message "You have
not selected any notification types. An event notification must contain at least one notification
type." The instructor clicks OK, the message box closes, and in Add Notification dialog box,
clicks Cancel. As a result, the Notification tab of the Create File Management task dialog box is
displayed. Then the instructor clicks the Report tab, which includes the Logging and Generate
a report sections. The Logging section includes three checkboxes: Log file, Error log file, and
Audit log file, from which Lof file and Error log file are selected. The Generate a report section
includes Report formats section, which consists of the following checkboxes: DHTML, HTML,
XML, CSV, and Text, from which DHTML is already selected. The Generate a report section
also includes the "Send reports to the following administrators" checkbox along with a text field
that is disabled.]

Here is where the action is – on the Condition tab. What do we want to specify? We want
to say 365 days. If this has been up here for 365 days, I want it gone. And you can see those
additional choices that I could make there. If it had not been modified, say, in six months,
maybe I want it out of there. If nobody has looked at it in three months, get it out of
there, right – whatever the conditions are for my organization and for my needs.

And then, over here, when does this process run? So I'm going to set this to...let's say Sunday
at 4:00 AM so that it does not conflict with our backup and other maintenance schedules,
update schedules, etcetera. And so what will happen is, every week at Sunday at 4:00 a.m.,
the process will run. It will take a look at any file that became a year old in the last week. It will
move it out of that folder and into the Expired directory. And here is a look at creating file
management tasks for the purposes of automating file management.

[The File Server Resource Manager (Local) Console is open and the Create File Management
Task dialog box is displayed. In this dialog box, the Report tab is already selected. Then the
instructor clicks the Condition tab, which includes the Property conditions section consisting of
four columns: Property, Operator, Value, and Offset. The Condition tab also includes the
following checkboxes: "Days since file was created," "Days since file was last modified," and
"Days since file was last accesses," all of which include a spin box that is disabled. The
instructor selects the "Days since file was created" checkbox and sets its spin box to 365.
Then the instructor clicks the Schedule tab, in which the Run at spin box is set to 6:26:09 AM,
and from the options Weekly and Monthly, Weekly is already selected. For Weekly, the
following checkboxes are displayed: Sunday, Monday, Tuesday, Wednesday, Thursday,
Friday, and Saturday, from which the instructor selects Sunday and sets the Run at spin box to
4:00 AM. Next the instructor clicks OK, the Create File Management Task dialog box closes,
and the File Server Resource Manager (Local) Console is displayed.]
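
For completeness, the same expiration job can be scripted with the FSRM cmdlets. This one is a rough sketch from memory – the New-FsrmFmjAction and New-FsrmFmjCondition parameter names and the File.DateCreated property name in particular are assumptions to verify against the cmdlet help before you depend on them:

# File management job: every Sunday at 4:00 AM, move files created more than
# 365 days ago out of C:\Share\Good For a Year and into C:\Share\Expired.
Import-Module FileServerResourceManager
$action    = New-FsrmFmjAction -Type Expiration -ExpirationFolder "C:\Share\Expired"
$condition = New-FsrmFmjCondition -Property "File.DateCreated" -Condition LessThan -DateOffset 365
$schedule  = New-FsrmScheduledTask -Time (Get-Date "04:00") -Weekly Sunday

New-FsrmFileManagementJob -Name "Expired after one year" `
    -Namespace @("C:\Share\Good For a Year") `
    -Action $action -Condition @($condition) -Schedule $schedule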

5. Demo: Storage reports


One of the things that a lot of us that have been managing file servers in Windows for, you
know, over the years have wrestled with is getting good insight into how a file server is actually
being used. Now today in Server 2012 R2 with FSRM installed on my file server, I get Storage
Reports Management. So I can readily generate reports that give me insight into how this thing
is actually being used. And then, of course, I can deduce the remediation that I have to take,
so that it is used optimally. You know, so I will give this report a name, we will call this
General.

And down here, I can look at...there are these predefined reports to generate – Duplicate Files,
right? Do we have a lot of duplicate files? Could I delete nine out of ten and still have the data I
need for the business? File Screening Audit – have people been trying to save the wrong kinds
of files to our file servers? And if they have what is the nature of those files? Are we concerned
about the potential for malicious attacks?

[The File Server Resource Manager (Local) Console is open and in the navigation pane, the
following nodes are displayed: Quota Management, File Screening Management, Storage
Reports Management, Classification Management, and File Management Tasks. The
instructor right-clicks Storage Reports Management and from the shortcut menu, clicks
Schedule a New Report Task and the Storage Reports Task Properties dialog box is
displayed. This dialog box includes four tabs: Settings, Scope, Delivery, and Schedule, from
which the Settings tab is selected by default. In the Settings tab, within the Report name field,
the instructor enters the name General. The tab includes the Report data section, which
consists of the "Select reports to generate" list box with "Review Selected Reports" and "Edit
Parameters" buttons. The section also includes the maximum Number spin box, which is
already set to 1000. The "Select reports to generate" list box includes the following
checkboxes: Duplicate Files, File Screening Audit, Files by File Group, and Files by Owner,
which are all selected.]

Files based on File Group, right? Remember what a File Group is? A File Group is a
predefined set of file types – so executable files: .exe, .cmd, .com, .bat files, and so on, right? So I
want to see files by group, files by owner, files by particular properties, right? If I created file
classification properties, I can search on those. Large files, least recently accessed, most
recently accessed, and quota usage – and so I can generate reports on any and all of these.
Maximum number of files to include in all storage reports, now that is a function of how many
files you have actually got, right? Let's say that what I am concerned about here really is
space, and so I just want to do a search for Large Files, right? And so if I have got any big
files in there that I can delete easily, you know, they will show up in this report.

[The File Server Resource Manager (Local) Console is open and the Settings tab of the
Storage Reports Task Properties dialog box is displayed. In the Settings tab, in the Report
data section, the "Select reports to generate" list box includes the following checkboxes:
Duplicate Files, File Screening Audit, Files by File Group, Files by Owner, Files by Property,
Folders by Property, Large Files, Least Recently Accessed Files, Most Recently Accessed
Files, and Quota Usage. From these checkboxes, the following are already selected: Duplicate
Files, File Screening Audit, Files by File Group, Files by Owner, Large Files, Least Recently
Accessed Files, Most Recently Accessed Files, and Quota Usage. The instructor clears all the
checkboxes except for the Large Files checkbox and clicks the Scope tab, which includes the
"Include all folders that store the following kinds of data" section consisting of following
checkboxes: Application Files, Backup and Archival Files, Group Files, and User Files.]

And here is the Scope, right? What kinds of files are we looking for? Well everything here
should be user files. And for this report I am going to generate it...not down at one of the
specific shares that, you know, we depend on but at the parent level. And I will get a report
based on the use of all of these critically important shares. Who does it go to? Do I want to
send an e-mail? Do I want an e-mail to report to somebody? Again guys for this to work, you
got to have an SMTP virtual server configured on this box. Those are configured using the
Internet Information Server, or IIS, 6.0 console, so you have got to install and configure them
there. How do I want this thing to run? I want it to run every Sunday, boom!

[The File Server Resource Manager (Local) Console is open and the Scope tab of the
Storage Reports Task Properties dialog box is displayed. The instructor selects User Files and
clicks Add, which launches the Browse For Folder dialog box. In this dialog box, the instructor
expands the Local Drive (C:), selects the Share folder, and clicks OK. As a result, the dialog
box closes, and in the Scope tab of the Storage Reports Task Properties dialog box, in the
"The following folders are included in this scope" section, the name C:\Share is displayed in
the Folder column. Next the instructor clicks the Delivery tab of the dialog box, which includes
the "Send reports to the following administrators" checkbox with a field that is disabled. Next
the instructor clicks the Schedule tab, in which the Run at spin box is already set to 4:37:41
PM, and from the options Weekly and Monthly, Weekly is already selected. For Weekly, the
following checkboxes are displayed: Sunday, Monday, Tuesday, Wednesday, Thursday,
Friday, and Saturday, from which the instructor selects Sunday, clicks the Settings tab, and
then clicks OK.]

In here we see the details, I could edit these reports here, I could run the report now. Generate
reports in the background, wait for the reports to be generated then display them – let us go
ahead and do that. And we can see this thing working, we see it generating a report, it is going
out there, it is reading, it is walking the file system, and it is looking for any large files that might
be in there. Maybe it will give me a choice to delete them, real quickly. And if we open up this
HTML document, we can see Internet Explorer launches and we get a nice DHTML report.
And we see that in fact here there are no large files, so that is not going to be the solution to
my problem, right? I didn't really expect there to be but that is not going to be the solution to
my problem. But you can see it is a nice, neat, orderly report and I get immediate insight into
what is happening on that share.

[The Storage Reports management page of the File Server Resource Manager (Local)
Console is open. In this page, the report General is displayed with Report Target as Large
Files, Scope as C:\Share, and Schedule as 4:37 PM. The instructor right-clicks General and
from the shortcut menu, clicks Run Report Task Now, which launches the Generate Storage
Reports dialog box. This dialog box includes two options: "Generate reports in the
background," which is to view saved or e-mailed reports later, and "Wait for reports to be
generated and then display them," which is to view the reports immediately upon completion.
The instructor selects "Wait for reports to be generated and then display them" and clicks OK.
As a result, the dialog box closes, and the Generating Storage Reports dialog box is launched,
which displays the progress of Generating storage reports. The report is generated and the
Interactive window is displayed, which includes the folder LargeFiles2_2013-11-27_16-40-
45_files and the HTML file LargeFiles2_2013-11-27_16-40-45. The instructor opens the HTML
file in Internet Explorer, which displays the report. The report includes the Report Totals table,
which includes the columns Files and Total size on Disk for "Files shown in the report" and
columns Files and Total size on Disk for "All files matching report criteria." In all the four
columns, the value displayed is 0.]
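
The same scheduled report can be created and kicked off from PowerShell. A minimal sketch – the LargeFiles report type name and the Start-FsrmStorageReport cmdlet are as I recall them on 2012 R2, so confirm with Get-Help:

# Weekly Large Files report against C:\Share, run every Sunday.
Import-Module FileServerResourceManager
$schedule = New-FsrmScheduledTask -Time (Get-Date "16:37") -Weekly Sunday
New-FsrmStorageReport -Name "General" -Namespace @("C:\Share") `
    -ReportType LargeFiles -Schedule $schedule

# Run it right now instead of waiting for Sunday.
Start-FsrmStorageReport -Name "General"
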
Using BitLocker Encryption in Windows Server
2012 R2
Learning Objectives
After completing this topic, you should be able to
◾ match the BitLocker implementation option to its characteristics
◾ sequence the steps to configure network unlock

1. BitLocker drive encryption


Not that long ago I did some work for an insurance company, big insurance company, global
footprint, 100,000 users worldwide. And these guys lost on average one laptop a day, right?
So a little over 300 laptops a year went missing – stolen out of airports, lockers, taxis, right? Just
gone...and on those laptops the proprietary information, the personally identifying information
of customers, clients, right? I mean a nightmare of legal problems can ensue from that kind of
loss.

Now guys, today we have BitLocker Drive Encryption. Now it is available on all the versions of
server and it is also available on the Windows 8 Pro and Enterprise client. For Windows 7, you
had to get Ultimate or Enterprise, it was not available in the Professional version. And
BitLocker Drive Encryption is just what it sounds like, it is full volume drive encryption. We
encrypt the entire contents of the volume. If it falls out of your hands, nobody can take that
drive out of the machine and walk the file system. Your data is secure.

Now how does BitLocker work? Well when the BIOS starts and initializes the TPM, that is the
Trusted Platform Module, that is a chip that is actually sorted to the motherboard. So if I am
interested in using BitLocker today, I want to buy hardware that has got that TPM chip. And
there is firmware on that chip and that is what secures the trusted chain of algorithms and keys
that are used to encrypt and decrypt the data.

Those trusted or measured components interact with the TPM to store component
measurements in the TPM's Platform Configuration Registers, or PCRs. If the PCR values
match the expected values, the TPM uses the Storage Root Key, or SRK, to decrypt the
Volume Master Key, the VMK. The encrypted Full Volume Encryption Key, or FVEK, is read
from the volume and the decrypted Volume Master Key is used to decrypt it. The disk sectors
are decrypted with the Full Volume Encryption Key as they are accessed and plain-text data is
provided to applications and processes.

Now the common question that I get at this juncture is Murph, if the whole volume is encrypted
and every time I access a file it has got to be decrypted, what is the overhead on this thing?
And that is a fair question, right? What we see in the 32-bit world is about a 10% hit on
processor and memory. And in the 64-bit world about a 6%, 6 to 8% hit on processor cycle
utilization and memory resource allocation.
Now BitLocker has been around since Server 2008, and we have it here in Server 2012 R2. And then
there is BitLocker To Go. Think about one of the security concerns that we all have, and that is
universal serial bus, or USB, devices. USB devices come and go. They are unmanaged, they
are unregulated, people can copy whatever they want onto them. So what can I do? Well
today, using Group Policy, I can define policies that force any USB in our place to be
encrypted with BitLocker To Go. BitLocker To Go will encrypt that USB device so that it is
portable, and with the BitLocker To Go Reader, I can read it on backward-compatible machines
right back to XP.

[The BitLocker To Go Reader dialog box includes a Type your password to unlock this drive
field.]

Folks, when you look at the BitLocker requirements, one of the first things that you see is a
TPM chip, but I have got to tell you, you don't need a TPM chip to use BitLocker. It is a more
secure implementation. It is what we would prefer, but if you have machines you really want to
bitlock, you can do it using a USB device. Now the problem with that is the USB has to be
present every time the machine starts up. And that is a problem, right? I mean look at me,
right? I just have this USB...look it...the thing is gone, it just snapped on me.

Now if I was counting on this thing to start up my machine, where would I be? And that is why I
personally do not like the USB solution in any combination. Requiring the user to have
it is always going to give you problems and Helpdesk calls. What I want is a machine with the
TPM on it so that I get the firmware built right into the motherboard. And then, I require an
additional something from the user, a personal identification number, or PIN. Every time the
user wants to start the machine up...they start the machine up, not only must they provide their
credentials, they must provide that PIN.

So if the machine were to fall out of the user's hands, and somebody – you know, the guy that
has stolen the machine – had the user's password and ID, the guy still would not be able to get
in because he does not have that PIN. That is the solution that I like. These others are just
variations on that theme: either there is just a TPM, or I use a TPM with the USB, or I use all three
together, which is very burdensome for the average user.

When you bitlock a drive today, you are prompted to save the recovery key to a file, to print
the recovery key, and to save the recovery key to a USB device; on a Windows 8 machine, you
are actually prompted to save it up to the cloud, right, which is really interesting. And do not do
just one of those but do all three of them. Why? Because if that machine fails, right – I have a
motherboard failure on that machine, I have a processor failure on that machine. Machine gets
run over...whatever. And I am going to take that drive out of the machine. I want to be able to
recover the data that is on that drive. The only way to do it, when I throw it in the USB
enclosure, it is going to prompt me for the recovery key. I have got to preserve those recovery
keys. Ideally, I store them in Active Directory and I define recovery agents.
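
If you do escrow recovery information in Active Directory, the BitLocker PowerShell module can push an existing recovery password up to AD DS. A sketch that assumes the "Store BitLocker recovery information in Active Directory Domain Services" policy is in place so the backup is accepted:

# Find the numerical recovery password protector on C: and back it up to AD DS.
$volume   = Get-BitLockerVolume -MountPoint "C:"
$recovery = $volume.KeyProtector | Where-Object KeyProtectorType -eq "RecoveryPassword"
Backup-BitLockerKeyProtector -MountPoint "C:" -KeyProtectorId $recovery.KeyProtectorId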

What do you have to have on the box to use BitLocker? Well you have to have a supported
operating system version, right? And we know the server versions are supported. We know
Windows 8 Pro and Enterprise and Windows 7 Enterprise and Ultimate are supported. I want to have
a TPM chip, although we said we can get around the TPM requirement using a USB device –
but there are all kinds of problems with that, right? I want to have a TCG-compliant BIOS or
UEFI firmware. TCG – that is Trusted Computing Group-compliant BIOS, right? And then the
hard-disk has to be partitioned. You need a hidden system drive, which is not encrypted. And
then the boot drive or the drive that contains the Windows operating system files needs to be
formatted with NTFS.

2. Demo: Installing and using BitLocker


Here we are, as you can see, in the Add Roles and Features Wizard. For BitLocker on the
server, I have to install the BitLocker Drive Encryption feature. So if we look over here we can
see on the Features page, you have got BitLocker Drive Encryption, and then it will encourage us
to install some supporting software, right? The management tools for that and the remote
administration tools for that. Then there is this choice down here BitLocker Network Unlock.
And we are going to install that separately because that has got to be installed on a Windows
Deployment Services, or WDS, server. We have got to have a Windows Deployment Services
server to support that feature. So we go ahead and we will accept those choices, we will say
Install and we will let the installation run. And when this is done, we will then be in a position to
configure this local machine for BitLocker Drive Encryption.

[The Server Roles page of the Add Roles and Features Wizard is open. The page contains the
Roles section, which includes the following options: Active Directory Certificate Services,
Active Directory Domain Services, Active Directory Federation Services, Active Directory
Lightweight Directory Services, Active Directory Rights Management Services, Application
Server, DHCP Server, DNS Server, Fax Server, File and Storage Services, Hyper-V, Network
Policy and Access Services, Print and Document Services, Remote Access, and Remote Desktop
Services. On this page, the instructor clicks Next and moves to the Features page of the wizard.
This page includes the following options: .NET Framework 3.5 Features, .NET Framework 4.5
Features, Background Intelligent Transfer Service (BITS), BitLocker Drive Encryption, BitLocker
Network Unlock, BranchCache, Client for NFS, Data Center Bridging, Direct Play, Enhanced
Storage, Failover Clustering, Group Policy Management, IIS Hostable Web Core, and Ink and
Handwriting Services. Next the instructor clicks BitLocker Drive Encryption and the Add Roles
and Features Wizard dialog box is displayed. The instructor then clicks the Add Features
button to close the dialog box and clicks Next. As a result, the Confirmation page of the Add
Roles and Features Wizard is displayed and the instructor clicks the Install button. The Results
page of the wizard is displayed along with the View Installation progress bar.]
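
The same installation can be done from an elevated PowerShell prompt. The feature name below is what I would expect on 2012 R2 – confirm it first with Get-WindowsFeature *bitlocker*:

# Install BitLocker Drive Encryption and its management tools, then restart,
# just like the wizard warns us it will have to do.
Install-WindowsFeature -Name BitLocker -IncludeAllSubFeature -IncludeManagementTools -Restart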

Now, you can see, guys, this is a file server, right? This is our Windows
2012 R2 file server. Let's imagine that this file server contains personally identifying
information of our clients. We have got credit card numbers, Social Security numbers, bank
account numbers, right – everything and anything that pertains to the privacy and personal
identity of our clients could be stored in files on this machine. Now the problem is this
machine sits in a remote office out there in Saskatoon; there are only 15 sales representatives in
that office and one administrative assistant. That is it. And we have had problems in that office
before, right? Hardware has gone missing before, equipment has gone missing before, we have
had break-ins...right, it is a low-security environment.

[The Results page of the Add Roles and Features Wizard is displayed along with the View
Installation progress bar.]

I want to be able to say to my clients and customers and to executive management that if this
server were to go missing, we are not at any risk. Having those drives encrypted with
BitLocker means that, in the event this server does go missing, I can go to executive
management with confidence and say, ladies and gentlemen, we are not exposed because of
this loss. And that is what executive management wants to hear.

As you can see this installation is going to require a reboot of this machine, so I am going to go
ahead and restart the machine now. Once this machine restarts, we can come out here to the
Start page; right away I know I am on a Windows 2012 R2 box as I have this little Start
shortcut, right, that brings us back out here. I could go into the Control Panel – there will be a
BitLocker choice in the Control Panel or I can search for BitLocker and I get the Manage
BitLocker tool and BitLocker Drive Encryption opens.

[The Results page of the Add Roles and Features Wizard is displayed along with the View
Installation progress bar. After a while, the instructor restarts the machine and navigates to the
Server Manager Dashboard. The partially displayed Server Manager Dashboard contains a
ROLES and SERVER GROUPS section, which includes two sub-sections as follows: Local
Server and All Servers. The instructor clicks the Start shortcut to navigate to the Start page.
Next the instructor opens BitLocker Drive Encryption from the Search and the BitLocker Drive
Encryption window is displayed.]

And now here I can see the BitLocker is off, the C drive is the operating system volume on this
machine, right? It is the only volume available on this machine, I want to go ahead and encrypt
that drive. How do I unlock the drive at startup? Now guys normally, of course, we would
require a TPM chip, right? Normally this machine would have a TPM chip and that is how we
would do pre-boot authentication and security. But these are virtual machines, so they don't
have a TPM chip. So I am going to be required to either enter a password or have a USB key
at startup every time because it is going to have to decrypt the contents of the operating
system volume in order to launch the machine, right? The only way to do that is with a Group
Policy setting that has got to be enabled, one that allows for BitLocker on machines without a
TPM. We will take a look at that in a later demo. I will show you where that is.

[The BitLocker Drive Encryption window is open and includes an Operating system drive
section. This section includes Turn on BitLocker link. The instructor clicks the Turn on
BitLocker link and the BitLocker Drive Encryption (C:) dialog box is displayed along with the
Checking your PC's configuration progress bar. After a while, Choose how to unlock your drive
at startup page of the dialog box is displayed. The page includes two links as follows: Insert a
USB flash drive and Enter a password. Next the instructor clicks the Enter a password link and
then enters password in the Enter your password and Reenter your password fields and clicks
Next. As a result, the “How do you want to back up your recovery key?” page of the dialog box
is displayed. The page contains three links as follows: Save to a USB flash drive, Save to a
file, and Print the recovery key.]

Now here we want to save this thing, right? Really, what I would want to do in real
life is...I would want to do all of those things, right? I would want to save it to a
file like I am doing here. Now in real life, guys, I would want to do every one of these things. I
would want to save it to a USB drive; I would want to save it to a file, and that file should be on a
network share someplace, right? I want to print the recovery key, and I am going to go ahead
and print that recovery key – call it key. And you know, the file I would want to
save in an encrypted location somewhere out on the network, the printed copy I would want to save in a
fire-proof safe on site, and the one that I store off to the USB I might ship to an off-site location
where it is saved in a fire-proof safe, right? Go ahead and run the system check, make sure we
are good to go; we are warned here that encryption will begin after computer restart.

[The “How do you want to back up your recovery key?” page of the BitLocker Drive Encryption
(C:) dialog box is open. The page contains three links as follows: Save to a USB flash drive,
Save to a file, and Print the recovery key. The instructor clicks the Save to a file link and the
Save BitLocker recovery key as dialog box is displayed. Next the instructor closes the Save
BitLocker recovery key as dialog box and navigates back to the “How do you want to back up
your recovery key?” page. The instructor then clicks the Print the recovery key link. As a result,
the Print dialog box is displayed and the instructor clicks Print and the Save Print Output As
dialog box is displayed. The instructor then enters 'key' in the File name field and clicks Save,
and navigates back to the “How do you want to back up your recovery key?” page and clicks
Next. As a result, the “Are you ready to encrypt this drive?” page of the BitLocker Drive
Encryption (C:) dialog is displayed and the instructor clicks the Continue button and navigates
back to the BitLocker Drive Encryption window. This window displays the "Encryption will begin
after computer restart" message.]

So I will go ahead and I will reboot this machine again. And now when this machine restarts
that process of drive encryption will happen automatically. But you will notice that I can't reboot
the machine in order to encrypt the drive until I have entered that password to unlock the drive.
I have successfully entered the password; machine will reboot and then the process of
encrypting the contents of the volume will begin.

Now my machine has restarted and we go back into the Manage BitLocker tool. And here you
can see BitLocker is encrypting, right? That process will chug along. Once this has completed,
the drive will be fully encrypted.

[The BitLocker Drive Encryption window is open and the instructor enters the password in the
Enter the password to unlock this drive field. As a result, the machine reboots. After a while,
the R2Fileserver1 on TARDIS1 – Virtual Machine Connection window is displayed.]
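
What the wizard just did can be reproduced with the BitLocker module. A hedged sketch of the password-protector scenario used on this TPM-less virtual machine, with a recovery password added as a second protector – remember that the OS drive only accepts a password protector when the Group Policy setting allowing BitLocker without a compatible TPM is enabled:

# Encrypt the operating system volume with a password protector.
$blPassword = Read-Host -AsSecureString -Prompt "BitLocker password"
Enable-BitLocker -MountPoint "C:" -PasswordProtector -Password $blPassword -EncryptionMethod Aes128

# Add a numerical recovery password as a second protector, then check progress.
Add-BitLockerKeyProtector -MountPoint "C:" -RecoveryPasswordProtector
Get-BitLockerVolume -MountPoint "C:" | Select-Object VolumeStatus, EncryptionPercentage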

3. Demo: BitLocker group policy


Like everything in Windows how do we manage it? We manage it with Group Policy, so you
can see here we are in the Group Policy Management Console. You can see what I have
done, I have created a Group Policy called BitLocker Client up at the domain level. If I wanted
to have different policies for my Servers, for my Workstations, I could create different BitLocker
policies at any organizational unit level. But for our purposes a single BitLocker policy across
the Active Directory domain should be just fine. So I am going to right-click this BitLocker
Client, I am going to go into Edit. And once we come into the Group Policy Management
Editor, we are going to drill down into the Computer Configuration – Policies – Administrative
Templates – Windows Components, BitLocker there it is.

And we see in there that there are different categories, right? There are separate settings for Fixed
Data Drives, Operating System Drives, and Removable Data Drives. For anybody who is testing,
you want to spend time looking at those removable data drive choices especially, and know
that they are in a separate subfolder. I want to highlight a thing here for the purposes of
testing, right? I am just going to sort these real quick – I don't know why they are not in
alphabetical order in the first place, but anyway.
[The Group Policy Management console is open. The navigation pane of the window contains
the following nodes and sub-nodes: Group Policy Management Forest:corp.brocadero.com
Domains Corp.brocadero.com BitLocker Client Certificate Services Default Domain
Policy DirectAccess Client… DirectAccess Server Domain Controllers Local Resource
Access… People Servers Workstations Client cert Serv WSUS HQ Kiosk
RemoteOffice1 Group Policy Object WMI Filters The instructor right-clicks the BitLocker
Client and selects Edit from the shortcut menu. As a result, the Group Policy Management
Editor window is displayed. The navigation pane of the Group Policy Management Editor
window includes the following nodes and subnodes: Computer Configuration Policies
Software Settings Windows Settings Administrative Templates Preferences User
Configuration Policies Preferences Next the instructor expands the Administrative Templates,
Windows Components, and selects BitLocker Drive Encryption. The following folders and
settings are displayed in the view pane: Fixed Data Drives Operating System Drives
Removable Data Drives Choose default folder for recovery password Choose drive encryption
method and cipher strength Choose drive encryption method and cipher strength (Wind…
Choose how a user can recover BitLocker-protected drives (… Prevent memory overwrite on
restart Provide the unique identifiers for your organization Store BitLocker recovery information
in Active Directory Domain Services settings Validate smart card certificate usage rule
compliance The instructor now double-clicks the Operating System Drives, which contains the
following files: Allow enhanced PINs startup Allow network unlock at startup Allow secure boot
for integrity validation Choose how BitLocker-protected operating system drives ca…
Configure minimum PIN length for startup Configure TPM platform validation profile
(Windows Vista, … Configure TPM platform validation for BIOS-based fi… Configure TPM
platform validation profile for native UEFI fir… Configure use of hardware-based encryption for
operating s… Configure use of passwords for operating system drives Disallow standard users
from changing the PIN or password Enable use of BitLocker authentication requiring preboot
ke… Enforce drive encryption type on operating system drives Require additional
authentication at startup Require additional authentication at startup (Windows Serve… Reset
platform validation data after BitLocker recovery Use enhanced Boot Configuration Data
validation profile]

I want to highlight a setting in here guys, remember when we did the BitLocker demo and we
encrypted that drive, it was on a machine that was a virtual machine, not a best practice but
great for, you know, demonstration purposes and for testing. And the way that I made that
happen because no virtual machine has a TPM, right? That is a physical component and that
does not exist. So what I did was I allowed BitLocker without a compatible TPM and that is the
first policy setting that I have to enable if I have got machines out there that I want to employ
BitLocker on but don't have the required hardware. Now for me personally when it comes to
BitLocker, I like BitLocker with a PIN, so the user has to enter not just their username and
password but they also have to enter a PIN, something known only to them. And the way that I
force that setting is here Allow startup PIN with TPM.

[The Operating System Drives folder is open, which contains the following files: Allow
enhanced PINs startup Allow network unlock at startup Allow secure boot for integrity
validation Choose how BitLocker-protected operating system drives ca… Configure minimum
PIN length for startup Configure TPM platform validation profile (Windows Vista, … Configure
TPM platform validation for BIOS-based fi… Configure TPM platform validation profile for
native UEFI fir… Configure use of hardware-based encryption for operating s… Configure use
of passwords for operating system drives Disallow standard users from changing the PIN or
password Enable use of BitLocker authentication requiring preboot ke… Enforce drive
encryption type on operating system drives Require additional authentication at startup
Require additional authentication at startup (Windows Serve… Reset platform validation data
after BitLocker recovery Use enhanced Boot Configuration Data validation profile The
instructor opens the Require additional authentication at startup window, which contains three
options – Not Configured, Enabled, Disabled, and an Options section. The Option section
includes an Allow BitLocker without a compatible TPM checkbox, and drop-down list boxes as
follows: Configure TPM startup, Configure TPM startup PIN, Configure TPM startup key, and
Configure TPM startup key and PIN. The Configure TPM startup drop-down is already set to Allow TPM,
Configure TPM startup PIN to Allow startup PIN with TPM, Configure TPM startup key to Do
not allow a startup key with TPM, and Configure TPM startup key and PIN to Do not allow
startup key and PIN with TPM.]

Now you may be saying Allow startup PIN with TPM well, wait a minute Murph, you just
allowed this for machines without a TPM. What does that mean? Well when I check this little
checkbox, you see right here it says it requires a password or startup key, right? It is going to
force that user to assign a PIN. If there is no TPM chip...if there is a TPM, this option Allow
startup PIN with TPM will give the user the option of setting a PIN as well. If they have a TPM
chip though I still want them to have a PIN so I am going to require the PIN, dig it.
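
For reference, the same TPM-plus-PIN requirement can be put on a drive from the command line;
a minimal sketch, assuming the operating system volume is C: and the policy above allows the
TPM and PIN combination:

# Add a TPM-and-PIN key protector to the operating system volume (prompts for the PIN)
manage-bde -protectors -add C: -TPMAndPIN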

Now over here let's come back up to the root of this and take a look at a couple of things real
quick. One of the things that I want to do is I want to store BitLocker recovery information in
Active Directory...oh, heck I do, you know, I do, right. If I don't do that, recovery becomes
problematic. We will talk more about recovery in the next demo. We enable that, and here, what
I would do is create a folder share out on the network, then specify that path here, and the
users' recovery passwords will be saved up to that network share. These are some of the critical
Group Policy settings I want to enable in order to manage BitLocker across my network.

[The Require additional authentication at startup window is open and contains three options –
Not Configured, Enabled, and Disabled and an Options section. The Option section includes
an Allow BitLocker without a compatible TPM checkbox, and drop-down list boxes as
follows: Configure TPM startup, Configure TPM startup PIN, Configure TPM startup key, and
Configure TPM startup key and PIN. The instructor selects Require startup PIN with TPM in
the Configure TPM startup PIN drop-down and clicks OK and
navigates back to the BitLocker Drive Encryption folder, which contains three folders as
follows: Fixed Data Drives, Operating System Drives, and Removable Data Drives and the
following settings: Choose default folder for recovery password Choose drive encryption
method and cipher strength Choose drive encryption method and cipher strength (Wind…
Choose how a user can recover BitLocker-protected drives (… Prevent memory overwrite on
restart Provide the unique identifiers for your organization Store BitLocker recovery information
in Active Directory Domain Services settings Validate smart card certificate usage rule
compliance The instructor clicks the Store BitLocker recovery information in Active Directory
Domain Services settings and the Store BitLocker recovery information in Active Directory
Domain Services settings window is displayed. The instructor then selects the Enabled option,
clicks OK, and navigates back to the BitLocker Drive Encryption folder. Next the instructor
clicks the Choose default folder for recovery password setting. As a result, the Choose default
folder for recovery password window is displayed.]
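
One note here: the Store BitLocker recovery information in Active Directory Domain Services
setting does not retroactively back up drives that were encrypted before the policy arrived. A
rough sketch of pushing one up by hand with manage-bde; the protector ID shown is a placeholder
you would copy from the -get output:

# List the key protectors on the volume and note the ID of the Numerical Password protector
manage-bde -protectors -get C:

# Back that recovery password up to Active Directory Domain Services
manage-bde -protectors -adbackup C: -id "{PROTECTOR-ID-FROM-ABOVE}"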
4. Demo: BitLocker recovery options
Here we want to take a look at some of the options for managing recovery. And the first thing
that we have to do before we let anybody use BitLocker is we set up the organization class
IDs. These are unique identifiers for my organization. Now guys again you can do this on an
organizational unit, or OU, level. In our example here I am doing it for the entire organization,
so I am simply going to make it easy, right? Our organization identifier is our name
Brocadero. Now once this Group Policy setting is applied, anybody that BitLocks a drive in the
place gets that organization ID applied to that BitLocker installation.

Now dig it, that is going to mean that our recovery agent will be able to decrypt that drive in the
event of recovery. Okay, what if you did this but you knew that there were machines out there
that were already BitLocked? How can you go back and bring them into the fold so that our
recovery agent certificate-based decryption will work? Well I go out...well I actually...you can
see it, it is in the notes here someplace, I am sure there it is. But what you do is you go out to
the machines that are already BitLocked. You launch the manage-bde utility and you use the
-SetIdentifier switch to stamp that Brocadero identifier onto the machine. Now there are
other things that we want to do here especially when we think about our operating system
drives.

[The BitLocker Drive Encryption folder is open and the following folders and settings are
displayed in the view pane: Fixed Data Drives Operating System Drives Removable Data
Drives Choose default folder for recovery password Choose drive encryption method and
cipher strength Choose drive encryption method and cipher strength (Wind… Choose how a
user can recover BitLocker-protected drives (… Prevent memory overwrite on restart Provide
the unique identifiers for your organization Store BitLocker recovery information in Active
Directory Domain Services settings Validate smart card certificate usage rule compliance The
instructor clicks the Provide the unique identifiers for your organization setting and the Provide
the unique identifiers for your organization window is displayed. The window contains three
options as follows: Not Configured, Enabled, Disabled and an Options section. The Options
section includes two fields as follows: BitLocker identification and Allowed BitLocker
identification. The Not Configured option is already selected. Next the instructor clicks the
Enabled option and enters Brocadero in the BitLocker identification field. After a while, the
instructor clicks OK and navigates back to the BitLocker Drive Encryption folder and opens the
Operating System drives folder.]
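
A minimal sketch of stamping that organization identifier onto a drive that was already
BitLocked before the policy existed; in current manage-bde builds the switch for this is
-SetIdentifier, and it picks up the value defined in the Provide the unique identifiers for your
organization policy:

# Apply the identification field from Group Policy to an already-encrypted volume
manage-bde -SetIdentifier C: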

So, if I come in here I can see Choose how BitLocker-protected operating system drives can
be recovered and I am going to enable this setting and I am going to Allow data recovery
agent, right? Again, we have got to have a data recovery agent for this to work. I am going to
allow these 48-digit recovery passwords and 256-bit recovery keys. And because I am taking
charge of all the BitLocker management up here at the domain level, I might choose Omit
recovery options from the BitLocker setup wizard...don't even present them to the user, right?

Lastly here I want to highlight this if we come up there is other settings up here under Windows
Settings – Security Settings – Public Key settings, I see this BitLocker Drive Encryption setting
and I see that there is no data recovery agent. What I would do is I would come in here, I could
browse the directory, I could go out, and I could search for my BitLocker Recovery Agent,
right? See I have this user created. And I would provision this user with a certificate and then
that user and that certificate become a master decryption key for everybody that is in the
Brocadero BitLocker domain.
[The Operating System Drives folder is open and contains the following settings: Allow
enhanced PINs startup Allow network unlock at startup Allow secure boot for integrity
validation Choose how BitLocker-protected operating system drives can be recovered
Configure minimum PIN length for startup Configure TPM platform validation profile (Windows
Vista, … Configure TPM platform validation for BIOS-based fi… Configure TPM platform
validation profile for native UEFI fir… Configure use of hardware-based encryption for
operating s… Configure use of passwords for operating system drives Disallow standard users
from changing the PIN or password Enable use of BitLocker authentication requiring preboot
ke… Enforce drive encryption type on operating system drives Require additional
authentication at startup Require additional authentication at startup (Windows Serve… Reset
platform validation data after BitLocker recovery Use enhanced Boot Configuration Data
validation profile The instructor clicks the Choose how BitLocker-protected operating system
drives can be recovered setting. As a result, the Choose how BitLocker-protected operating
system drives can be recovered window is displayed. The window contains three options as
follows: Not Configured, Enabled, Disabled and a section – Options. The Options section
includes the following checkboxes: Allow data recovery agent, Omit recovery options from the
BitLocker setup wizard, Save BitLocker recovery information to AD DS for operating system
drives, Do not enable BitLocker until recovery information is stored to AD DS for operating
system drives and drop-down list boxes: Configure user storage of BitLocker recovery
information and Configure storage of BitLocker recovery information to AD DS. The Not
Configured option is already selected and the instructor selects the Enabled option. Next the
instructor checks the Omit recovery options from the BitLocker setup wizard checkbox, clicks
OK, and navigates back to the Operating System Drives folder. Next the instructor right-clicks
the BitLocker Drive Encryption subnode from the navigation pane and selects Add Data
Recovery Agent from the shortcut menu. As a result, the Add Recovery Agent Wizard is
displayed and the instructor clicks Next. The Select Recovery Agents page of the wizard is
displayed. The page contains two buttons as follows: Browse Directory and Browse Folders.
The instructor clicks the Browse Directory button and the Find Users, Contacts, and Groups
dialog box is displayed. Next the instructor enters bitlock in the Name field and clicks the Find
Now button and the following details are displayed in the Search results section: Name
BitLocker Recovery Agent Type User Description]

5. Demo: BitLocker network unlock


Now ladies and gentlemen, if you enforce BitLocker, and you enforce the use of a PIN, what
happens is people get tired of putting in the PIN, that is the bottom line. You know, you get
calls: Murph, every time the machine goes to sleep, I got to put in the PIN, right? Every time it
wakes up from hybrid sleep, I got to put in the PIN, right? Why do I have to put in the PIN every
time, isn't there a better way?

Well today with Server 2012 and 2012 R2 we support a feature called Network Unlock. And so
on my networks where I have got a domain, right, so when I am in the office and there is a
domain controller present to authenticate, I can skip the PIN portion. Basically we use a service,
and that service supplies the unlock key in place of the PIN, for the purposes of simplifying
end-user input, right? That is all
it is guys, that is all it is. Now the trick with this is you need a WDS server, and that is where
the Network Unlock feature has to be installed. So test tip for all you test takers out there. If
you got a bunch of servers in your place and you are looking to deploy the Network Unlock
feature, what server are you going to install it on, the WDS server.
Now folks here you can see we are sitting on a WDS server, right? You will notice that this
server is not a DHCP server. You must have a separate DHCP server available on the network
in order to install the Network Unlock feature...can't be on the WDS server. And here we are
under Features and we see there is the BitLocker Network Unlock, go ahead and we will
select that and hit Next there and we will tell it to Install.

So a couple of things about this guys, right? It is only available when the client is on the
network, right, when there is a domain controller present for authentication. Otherwise, when
the machine goes to sleep or into hibernation, it is going to require the PIN when it comes back
to life. And clearly this means that we ought to be requiring a PIN plus the TPM,
right?

This is really only available for Windows 8 and Windows Server 2012 machines that have a
UEFI compliant BIOS.

[The Server Manager WDS window is open. The toolbar of the window contains menus as
follows: Manage, Tools, View, and Help. The instructor right-clicks Manage and selects Add
Roles and Features from the shortcut menu and the Add Roles and Features Wizard is
displayed and the instructor clicks Next. The Installation Type page of the wizard is displayed
and the instructor clicks Next. As a result, the Server Selection page of the wizard is displayed
and the instructor clicks Next and the Features page of the wizard is displayed, which contains
the following options: .NET Framework 3.5 Features .NET Framework 4.5 Features
Background Intelligent Transfer Service (BITS) BitLocker Drive Encryption BitLocker Network
Unlock BranchCache Client for NFS Data Center Bridging Direct Play Enhanced Storage
Failover Clustering Group Policy Management IIS Hostable Web Core Ink and Handwriting
Services Next the instructor checks the BitLocker Network Unlock checkbox and clicks Next.
As a result, the Confirmation page of the Add Roles and Features Wizard is displayed and the
instructor clicks the Install button. The Results page of the Add Roles and Features Wizard is
displayed along with a progress bar. The instructor clicks Close to close the wizard and
navigates to the Server Manager WDS window.]
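
If you prefer to script that feature install on the WDS server instead of clicking through the
wizard, a sketch along these lines should work; confirm the exact feature name on your build
with Get-WindowsFeature first:

# Confirm the feature name, then install BitLocker Network Unlock on the WDS server
Get-WindowsFeature *BitLocker*
Install-WindowsFeature BitLocker-NetworkUnlock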

And we are going to go into the Group Policy Management Editor, and you can see where we
are right now. There are two places where BitLocker settings have to be configured to deploy
the BitLocker certificate for Network Unlock. I am up here under Windows Settings – Security
Settings – Public Key Policies – BitLocker Drive Encryption Network Unlock Certificate. And what
I would have to
do is I would have to come in here into this wizard this is just like specifying a recovery agent. I
go ahead and I grab the user whose certificate will be used to generate all of the Network
Unlock keys for all the clients, boom.

Now down here there is more to this, right? If we come back down here into BitLocker Drive
Encryption, and again remember where that setting was, it was in Operating System Drives,
Require additional authentication at startup, right? The startup key means I have to have a USB
device with me all the time to get the machine even started; that is not good, I don't want that.
But I do want the user to have a PIN that they remember, right? And so down there we are going
to set Require startup PIN with TPM, which is the recommended though not required setting
when using Network Unlock.

[The Server Manager WDS window is open and the window partially displayed. The instructor
opens the Group Policy Management Editor window. The navigation pane of the Group Policy
Management Editor window includes the following nodes and subnodes: Computer
Configuration Policies Software Settings Windows Settings Name Resolution Policies
Scripts (Startup/Shutdown… Security Settings Account Policies Local Policies Event
Log Restricted Groups System Services Registry File System Wired Network (IEEE…
Windows Firewall … Network List… Wireless Network… Public Key Policies Encrypting
File Data Protection BitLocker Drive Encryption BitLocker Drive Encryption Network
Unlock Certificate The instructor right-clicks the BitLocker Drive Encryption Network Unlock
Certificate subnode and selects Add Network Unlock certificate from the shortcut menu. As a
result, the Add Network Unlock Certificate Wizard is displayed and the instructor clicks Next.
The Select Network Unlock Certificate page of the wizard is displayed. The instructor clicks
Cancel to cancel the page and navigates to the BitLocker Drive Encryption folder, which
includes Fixed Data Drives, Operating System Drives, Removable Data Drives folders and the
following settings: Choose default folder for recovery password Choose drive encryption
method and cipher strength Choose drive encryption method and cipher strength (Wind…
Choose how a user can recover BitLocker-protected drives (… Prevent memory overwrite on
restart Provide the unique identifiers for your organization Store BitLocker recovery information
in Active Directory Domain Services settings Validate smart card certificate usage rule
compliance Next the instructor opens the Operating System Drives folder, which includes the
following settings: Allow enhanced PINs startup Allow network unlock at startup Allow secure
boot for integrity validation Choose how BitLocker-protected operating system drives ca…
Configure minimum PIN length for startup Configure TPM platform validation profile
(Windows Vista, … Configure TPM platform validation for BIOS-based fi… Configure TPM
platform validation profile for native UEFI fir… Configure use of hardware-based encryption for
operating s… Configure use of passwords for operating system drives Disallow standard
users from changing the PIN or password Enable use of BitLocker authentication requiring
preboot ke… Enforce drive encryption type on operating system drives Require additional
authentication at startup Require additional authentication at startup (Windows Server… Reset
platform validation data after BitLocker recovery Use enhanced Boot Configuration Data
validation profile Next the instructor opens the Require additional authentication at startup
(Windows Server…) settings and the Require additional authentication at startup (Windows
Server…) window is displayed. The partially displayed window contains an Options section,
which includes Allow BitLocker without compatible TPM checkbox, Configure TPM startup key
and Configure TPM startup PIN drop-down list boxes. The Allow BitLocker without compatible
TPM checkbox is already checked, the Configure TPM startup key is already set to Do not
allow startup key and Configure TPM startup PIN is already set to Require startup PIN with
TPM. Next the instructor clicks OK.]
Configuring EFS in Windows Server 2012 R2
Learning Objective
After completing this topic, you should be able to
◾ recognize EFS management best practice

1. EFS overview
Built into the NTFS file system from the very beginning has been a user-based file encryption
methodology called EFS, the Encrypting File System. And it is a great thing, in so many ways it
is a great thing, right, because there are no special hardware requirements. It is built right in;
anybody who has got a Windows machine and an NTFS partition...can right-click any file or
folder, go into those Advanced properties, and click that little Encrypt button...bang and it is
encrypted. And the beauty of that is that nobody can get at it except the user who encrypted it.

Now, of course, the user can choose to share that with other folks, but by default, out of the
box, the only person who can see it is the user who encrypted it. And it is transparent,
completely transparent to the user. The user wants to see the file. They open it up, they see
it...boom, just like that. Now there are some shortcomings with this. This is file-based, not disk-
based. And so we get no protection in that preboot environment. As a file encryption solution,
right, it is well tested, well proved, and it is built right in...does not cost you anything else for
licensing and...but we want to think a little bit about the management of it.

2. Demo: Enabling EFS


EFS, of course, is a function of the local file system and so I want to get control over it using
Group Policy at the domain level. Otherwise, the EFS policies that exist on every local
machine are controlled locally and I don't want that. I want centralized management. The
particular concern there is that if the user certificate on the local machine is lost or becomes
corrupt then we are in trouble. So what I want to do is I want to come in here to Group Policy,
you can see where the Default Domain Policy...because I want this setting to apply across the
network, right? If I wanted separate policies for individual organizational units, I could certainly
do that. I am going to come down here under Security, Public Key Policies and there is the
Encrypting File System.

[The Group Policy Management Window is open and includes the following options in the
navigation pane: Group Policy Management Forest: corp.brocadero.com Domains
Corp.brocadero.com Default Domain Policy Contractors Domain Controllers Resource
Access Groups Servers Group Policy Objects WMI Files Starter GPOs Sites Group Policy
Modelling Group Policy Results The instructor right-clicks the Default Domain Policy option
and selects edit from the shortcut menu. As a result, the Group Policy Management Editor
window is displayed. The partially displayed navigation pane of the Group Policy Management
Editor window includes the following options: Computer Configuration Policies Software
Settings Windows Settings Name Resolution Policies Scripts (Startup/Shutdown…
Security Settings Account Policies Local Policies Event Log Restricted Groups System
Services Registry File System Wired Network (IEEE… Windows Firewall … Network
List… Wireless Network… Public Key Policies Encrypting File System Data Protection
BitLocker Drive Encryption BitLocker Drive Encryption Network Unlock Certificate The
instructor then clicks the Encrypting File System option and the following information is
displayed in the view pane: Issued To Administrator Issued By Administrator Expiration Date
10/1/2013 Intended Purposes File Recovery]

And in here we can see here is the Administrator account. This Administrator account is the
data recovery agent. If I click in here and say Add Data Recovery Agent right, I could come
out here to the directory. I could browse the directory and I could find who I have got out here
that could act as the Data Recovery Agent. And in our example, I have already added that
Administrator account so that I can show you these settings. Here if we come into Properties
for EFS, we can choose Don't allow. If we don't have a Public Key Infrastructure, or PKI, if we
don't have a way to manage certificates across the network for EFS, I could choose to block it
right here and then nobody is going to be able to use EFS for file encryption. I can choose to
Allow it and when we allow it, we get some things down here, right?
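
As a side note, if you do not have a certification authority to issue a recovery agent certificate,
cipher.exe can generate a self-signed one that you then import into this policy; a minimal
sketch, and the file name EFSRecoveryAgent is just an example:

# Generates EFSRecoveryAgent.cer and EFSRecoveryAgent.pfx (prompts for a password to protect the .pfx)
cipher /r:EFSRecoveryAgent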

Now I am not using the right kind of certificates to allow this high level of Elliptic Curve
Cryptography, but that is okay because what I get from, you know, the default self-signed
certificates for this test purpose is just fine, right. I am going to allow, I am going to allow if it is
available and then down here we have some additional choices, Encrypt the contents of the
user's Documents folder. I just want to encrypt those babies, right? Use the user's local EFS
certificate that we are going to provide for them from Active Directory Certificate Services
Autoenrollment policies.

[The Group Policy Management Editor window is open and includes the following options:
Computer Configuration Policies Software Settings Windows Settings Name Resolution
Policies Scripts (Startup/Shutdown… Security Settings Account Policies Local Policies
Event Log Restricted Groups System Services Registry File System Wired Network
(IEEE… Windows Firewall … Network List… Wireless Network… Public Key Policies
Encrypting File System Data Protection BitLocker Drive Encryption BitLocker Drive
Encryption Network Unlock Certificate The instructor right-clicks the Encrypting File System
and selects Add Data Recovery Agent from the shortcut menu. As a result, the Add Recovery
Agent Wizard is displayed and the instructor clicks Next. The Select Recovery Agents page of
the wizard is displayed. The page contains two buttons as follows: Browse Directory and
Browse folder. Next the instructor clicks the Browse Directory button. As a result, the Find
Users, Contacts, and Groups dialog box is displayed. The instructor then clicks the Find Now
button in the dialog box the following information is displayed in the Search results section:
Name IIS_IUSRS Distributed CO… Performance Lo… Performance M… Domain Users
Domain Admins Type Group Group Group Group Group Group Description Built-in group used
by Internet… Members are allowed to launch… Members of the group may… Members of the
group can All domain users Designated administrators The instructor clicks Close to close the
dialog box and navigates back to the Add Recovery Agent Wizard, and clicks Cancel to close
the wizard. Next the instructor right-clicks the Encrypting File System and selects Properties
from the shortcut menu. As a result, the Encrypting File System Properties dialog box is
displayed, which contains three tabs as follows: General, Certificate, and Cache. The General
tab is already open and the general tabbed page includes three sections as follows: File
encryption using Encrypting File System (EFS), Elliptic Curve Cryptography, and Options. The
File encryption using Encrypting File System (EFS) includes three options as follows: Not
defined, Allow, Don’t allow. The Elliptic Curve Cryptography section includes three options as
follows: Allow, Require, Don't allow. The Options section includes four checkboxes as follows:
Encrypt the contents of the user's Documents folder, Require a smart card for EFS, Create
caching-capable user key from smart card, and Display key backup notifications when user
key is created or changed. The Not defined option under the File encryption using Encrypting
File System (EFS) is already selected, the instructor selects the Allow option. The Create
caching-capable user key from smart card checkbox is already checked in the Options section
and the instructor checks the Encrypt the contents of the user's Documents folder.]

If we are using smart cards, I can require a smart card for EFS, I can create caching-capable
user keys from the smart card, right? Again these choices are dependent upon the existence
of smart cards, which we don't use in our organization. If I want the user to see that backup
notifications when the user key is created or changed, I can check that. But if I am using Active
Directory Certificate Services and Autoenrollment policies, it is less important to me because
this is all happening transparently in the background.

Over here I can allow EFS to generate those self-signed certificates when a certification
authority is not available, you know, as is the case here right, we don't have that here. And I
can set a key size for the RSA self-signed certificate. I want to use those 2048-bit keys in
everything I do these days, right? That is what I want to use. And key size for Elliptic Curve
Cryptography, when available...and then down here clear encryption key cache after this
timeout, if I want to set that and I can also force it when the user locks the workstation.

[The Encrypting File System Properties dialog box is open and contains three tabs as follows:
General, Certificate, and Cache. The General tab is already open and the general tabbed page
includes three sections as follows: File encryption using Encrypting File System (EFS), Elliptic
Curve Cryptography, and Options. The File encryption using Encrypting File System (EFS)
includes three options as follows: Not defined, Allow, Don't allow. The Elliptic Curve
Cryptography section includes three options as follows: Allow, Require, Don't allow. The
Options section includes four checkboxes as follows: Encrypt the contents of the user's
Documents folder, Require a smart card for EFS, Create caching-capable user key from smart
card, and Display key backup notifications when user key is created or changed. The instructor
unchecks the Create caching-capable user key from smart card checkbox in the Options
section. Next the instructor clicks the Certificates tab. As a result, the certificates tabbed page
is displayed, which contains a EFS template for automatic certificate requests field, an Allow
EFS to generate self-signed certificates when a certification authority is not available
checkbox, and two drop-down list boxes as follows: Key size for RSA self-signed certificates
and Key size for Elliptic Curve Cryptography self-signed certificates. The instructor then clicks
the Cache tab. As a result, the Cache tabbed page is displayed and contains two checkboxes
– Clear encryption key cache when user locks workstation and Cache timeout occurs – and a
Cache timeout spin box.]

And so these are the choices that are available. Once I have deployed a certificate to
somebody who can then act as my Data Recovery Agent, we can go ahead and choose to
allow EFS across the network. We can choose to automatically Encrypt the contents of the
user's Documents folder. But more importantly, what we have really got to do is engage in
some kind of user education so that folks understand when and how they should apply EFS.
BitLocker, today, on those supported clients should really replace EFS as our choice so that
we don't have to think about it, we don't have to worry about it, we don't have to depend on the
users. If I have client machines that are not BitLocker capable then for those machines I would
apply EFS policies.

[The Cache tabbed page of the Encrypting File System Properties dialog box is displayed. The
page contains two checkboxes – Clear encryption key cache when user locks workstation and
Cache timeout occurs – and a Cache timeout spin box. The Cache timeout occurs checkbox is
already selected. The instructor clicks the General tab and clicks OK, and navigates to the Encrypting
File System option, which displays the following information: Issued To Administrator Issued
By Administrator Expiration Date 10/1/2013 Intended Purposes File Recovery The instructor
right clicks the Encrypting File System option from the navigation pane and selects Properties
from the shortcut menu. As a result, the Encrypting File System properties dialog box is
displayed. The dialog box contains three tabs as follows: General, Certificate, and Cache.]

3. EFS best practices


When we think about managing EFS, what are some of the best practices? Now when we
think about managing EFS across our networks, right, because right now if you have not done
anything to manage it, every user in the place can lock up their files and if their local certificate
is destroyed or corrupt, you will never get those files back. So what you want to do is figure out
a way to have multiple recovery agents and to store the keys securely. How are you going to do
that? You are going to do it with Group Policy. You are going to define recovery agents for the
network, and you are going to store those keys up in Active Directory. That is the way to
manage EFS.

The other piece of this is to educate users. Educate them not just to encrypt documents but to
encrypt the folder itself, right. Guys, you know why? Because every time you open up a file, it
creates a temporary copy of the file. If I encrypt the file but not the folder it is in, the temp file is
plain text. Anybody that has cracked their way onto my box, or can read those areas of system
memory or the disk that the temp file is written to, can get at the data that is in that file. So we
encrypt the folder, not just the file.
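
From the command line, encrypting at the folder level looks roughly like this; D:\Research is
just an example path:

# Encrypt the folder and everything beneath it, so new files and temp copies inherit encryption
cipher /e /s:"D:\Research"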

This is a great chart guys that gives me, kind of, an overview of the three encryption
methodologies we are talking about. EFS, BitLocker drive encryption – that is BDE, and
BitLocker To Go – that is B2G, as it appears in the chart. And what we see here is that EFS is
user-based. Only the user that encrypts the files has access to them. While BitLocker is drive-
based, anybody that has the rights to log into that machine, or has a PIN to unlock that
machine...it is going to decrypt the volume and it is going to work for that user...no relationship
with the user.

Is there any special hardware required? Not really, right? With EFS, I have got to have an
NTFS partition. With BitLocker drive encryption, I would prefer to have a TPM chip, but not
necessarily. Support for disk encryption, that is the function of BitLocker To Go and BitLocker
drive encryption, right, while EFS is file-level encryption. Group Policies are available to manage
all of these methodologies, and recovery keys are critical to being able to recover from a loss of
the principal's certificate that encrypted the file, or of the keys that were used to encrypt in a
BitLocker scenario. Having those recovery keys and agents available makes the encryption
recoverable in the event of a failure.

[The table comparing the encryption methodologies consists of three columns – EFS, BDE, and
B2G – and seven rows as follows: User-based, Boot validation, Special hardware, Disk
encryption, File encryption, Group policy available, and Recovery key. The User-based, File
encryption, Group policy available, and Recovery key features are ticked for EFS. The Boot
validation, Special hardware, Disk encryption, Group policy available, and Recovery key features
are ticked for BDE. The Special hardware, Disk encryption, Group policy available, and Recovery
key features are ticked for B2G.]
Using Audit Policies in Windows Server 2012
R2
Learning Objective
After completing this topic, you should be able to
◾ recognize the advantages of auditing user activity

1. Audit policy overview


We have always been able to audit in Windows Server. And by that I mean our ability to audit, our
ability to put in place automated processes that watch what is happening on that box, right? On
my file servers...who is accessing these files? What are they doing? Are they changing these
files? Are they editing these files? Are they deleting files on this file server? I want to know who
is doing what. On my domain controller, who is logging in, who is getting on the network, when
was the last time they logged on, right? I want to know these things and I want to know these
things so that I can spot problems, so that when problems happen, we can do forensic analysis
and keep those problems from happening again.

I want to be able to report on trending data, right? I want to know that we used to have this
problem more often, but I can demonstrate that our remediation means that we have this
problem less and less, right...that we are secure. I know we are secure because I can see it in
the audit logs. That is what I want to know. And so we audit, we set up automated systems to
watch what is happening and then to be able to report on that, so that we know we are in
compliance and we are secure.

2. Demo: Creating audit policies


Here we are in the Group Policy Management Editor, under the Computer Configuration node,
Policies - Windows Settings - Security Settings - Local Policies - Audit Policy. And this is where
we enable auditing in these classic categories. And these should be familiar to most of us.
These have been around, you know, all the way back. This setting – Audit account logon
events, we enable this on our domain controllers because by definition we are auditing
authentication to Active Directory. And where does that happen, on my authentication server,
my domain controllers. As opposed to the auditing of logon events; where do we enable logon
events? We enable these on our servers or on our client machines where local accounts are
being used, may be on kiosk machines, client machines, servers with local user accounts. And
we want to make that distinction between these domain logons and local logons.

[The Group Policy Management editor window is open. The window contains two nodes as
follows: Computer Configuration and User Configuration. The Computer Configuration node
includes subnodes – Policies and Preferences. The Policies subnode, includes Software
Settings, Windows Settings, and Administrative Templates. The Windows setting subnode
includes a list of subnodes. The Security Settings subnode under Windows setting node also
includes a list of subnodes. The Local Policies subnode under Security Settings node includes
Audit Policy, User Rights Assignment, and Security Options. The instructor opens the Audit
Policy and the following policies are displayed: Policy Audit account logon events Audit
account management Audit directory service access Audit logon events Audit object access
Audit policy change Audit privilege use Audit process tracking Audit system events Policy
Setting Not Defined Not Defined Not Defined Not Defined Not Defined Not Defined Not
Defined Not Defined Not Defined The instructor then opens the Audit account logon events
and as a result the Audit account logon events Properties dialog box is displayed. The dialog
box contains two tabs as follows: Security Policy Setting and Explain. The Security Policy
Setting tab is already open. The Security Policy Setting tabbed page includes a Define these
policy settings options and an Audit these attempts section. The Audit these attempts section
includes two options as follows: Success and Failure. The instructor selects the Explain tab
and then clicks Cancel to close the dialog box and navigate to the Audit Policy node. The
instructor clicks the Audit logon events policy and the Audit logon events Properties dialog box
is displayed. The dialog box contains two tabs as follows: Security Policy Setting and Explain.
The Security Policy tab is already open. The Security Policy Setting tabbed page includes a
Define these policy settings option and an Audit these attempts section. The Audit these
attempts section includes two options as follows: Success and Failure. The instructor selects
the Explain tab and then clicks Cancel to close the dialog box and navigate to the Audit Policy
node.]

Then there is directory service access, account management, policy change, and privilege
use. And these all fall broadly into the category of Active Directory Management. Am I
accessing Active Directory objects and changing their attributes? Am I giving you a new
department or a new zip code? Am I creating new objects in Active Directory, new user
accounts or resetting a user password? Am I changing user rights assignments? Am I granting
you the right to act as part of the operating system? Am I granting you the right to log on
to a server, when maybe your account doesn't have that right built-in?

And then down here, are those privileges being exercised? Are you logging in to that server? If
I want to know that, I audit privilege use. Do I want to track what applications and services are
doing…? I audit process tracking. And if I want to know if somebody is attempting to turn the
machine off and failing, if I want the event viewer to log an event when the event log size limit is
exceeded, you know, the last thing it does before it shuts down is it logs that event, I enable the
auditing of system events. And when I enable any of these here, those changes are applied. But
there is one
exception to that and that is Audit object access.

[The Audit Policy node is open and the following policies are displayed: Policy Audit account
logon events Audit account management Audit directory service access Audit logon events
Audit object access Audit policy change Audit privilege use Audit process tracking Audit
system events Policy Setting Failure Success, Failure Success, Failure Not Defined Not
Defined Not Defined Not Defined Not Defined Not Defined The instructor opens the Audit
object access policy and the Audit object access Properties dialog box is displayed. The dialog
box contains two tabs as follows: Security Policy Setting and Explain. The Security Policy tab
is already open. The Security Policy Setting tabbed page includes a Define these policy
settings option and an Audit these attempts section. The Audit these attempts section includes
two options as follows: Success and Failure.]
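
For anyone scripting this, the same classic categories can be flipped on from the command line
with auditpol; a minimal sketch, enabling success and failure auditing for object access on the
local machine:

# Enable success and failure auditing for the Object Access category
auditpol /set /category:"Object Access" /success:enable /failure:enable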

And when we think about the auditing of object access, we principally think about our file
servers. Particularly if I live in a world of compliance or regulation, I think about you guys with
HIPAA data or Sarbanes-Oxley compliance requirements; any government or institutional firm
is going to have their own set of compliance requirements. And so when I enable object
access on my file servers, I can then track who reads these documents, who writes to these
documents, who modifies these documents, who deletes subfolders, directories, et cetera;
what they are doing in there. I want to know that.

Now the limitation of this is that when I enable it, that is not enough. I then have to go to those
sensitive directories, or at least I did in the past...dig it, here is the change that came around in
2008. In the past, I always had to go to the directories and then specify the auditing for that
directory; but back in the 2008 time frame we got these advanced audit policy settings. And
down here, there is a choice for Global Object Access Auditing.

[The Audit object access Properties dialog box is open. The instructor selects the Explain tab
and then clicks Cancel to close the dialog box and navigate to the Audit Policy node. Next the
instructor clicks the Advanced Audit Policy node. Then the instructor opens the Global Object
Access Auditing option from the Audit Policies node.]

Today I can enable object access, and then define policy settings not based on particular
directories but based on who is accessing those directories. So I think about my R and D
people. These guys have a billion dollars' worth of research data on their machines, on these
file servers. It is really important to me that I know who is writing to these files. If anybody
attempts to take ownership of these files, if anybody attempts to change permissions on these
files, or delete these files or subfolders, I need to know that. I want to have a history of all those
changes so that if there is any problem we know what happened, right, we can try to figure out
what happened. And that is Global Object Access Auditing; that is a great thing.

[The Advanced Audit Policy node of the Group Policy Management Editor window is open. The
instructor opens the Global Object Access Auditing option from the Audit Policies node. As a
result, the following information is displayed in the view pane: Resource Manger File system
Registry Audit Events Not configured Not configured Next the instructor opens the File system
and the File system Properties dialog box is displayed. The dialog box contains two tabs as
follows: Policy and Explain. The Policy tab is already open. The instructor then selects the
Define the policy setting option in the Policy tabbed page. Next the instructor clicks the
Configure button and as a result the Advanced Security Settings for Global File SACL window
is displayed. The window includes an Auditing tab and the instructor clicks the Add button in
the tabbed page. As a result, the page to Select a principal is displayed. Then the instructor
clicks the Select a principal hyperlink and selects the object named AllRandD and as a result,
all the permissions on the page get enabled. The instructor clicks Clear all button and all the
permissions get unselected. Then the instructor selects the following permissions: Write, Take
ownership, Change permissions, Read permissions, Delete, Delete subfolders and files, Write
extended attributes, Write attributes, Create folders / append data, and Create files / write
data. Next the instructor clicks OK and closes all the windows and dialog box and navigates to
the Group Policy Management editor.]
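
That Global Object Access Auditing SACL can also be defined from the command line with
auditpol's resourceSACL option. This is only a sketch, with CORP\AllRandD standing in for the
R and D group, and the access flags are worth double-checking against auditpol /resourceSACL /?
on your build:

# Audit successful and failed write attempts by the R and D group on every file on this server
auditpol /resourceSACL /set /type:File /user:CORP\AllRandD /success /failure /access:FW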

Now in addition to that, what I also get are all of these sub-categories of audit. There are like
36 new categories in here, added over the last couple of iterations of the operating system. And if I look
at object access, here is a great one. Right, these guys have access to file servers that have a
billion dollars' worth of research data on them. Now I want to know if anybody is copying that
stuff to their local USB drives, right. I want to audit if people are attempting and/or failing to use
those USB drives. I want to know if data is leaving this place, right.
[The Advanced Audit Policy node of the Group Policy Management Editor window is open. The
instructor opens the Object Access option from the Audit Policies node. As a
result, the following information is displayed in the view pane: Subcategory Audit Application
Generated Audit Certification Services Audit Detailed File Share Audit File Share Audit File
System Audit Filtering Platform Connection Audit Filtering Platform Packet Drop Audit Handle
Manipulation Audit Kernel Object Audit Other Object Access Events Audit Registry Audit
Removable Storage Audit SAM Audit Central Access Policy Staging Audit Events Not
configured Not configured Not configured Not configured Not configured Not configured Not
configured Not configured Not configured Not configured Not configured Not configured Not
configured Not configured Next the instructor opens the Audit Removable Storage and the
Audit Removable Storage Properties dialog box is displayed. The dialog box contains two tabs
as follows: Policy and Explain. The Policy tab is already open. The instructor selects the
Configure the following audit events checkbox and selects both options, Success and Failure,
under it. Then the instructor clicks OK and the Object Access node in the Group Policy Management Editor
window is displayed.]

Now in addition to the Graphical User Interface here, we can also come over to the command
prompt and for those of you who are managing server core installations and you want to set
audit policy in here if you want to script your solutions, there is auditpol. And I can see in
here that I have these principal switches, I can backup audit policy from here, restore audit
policy, which means I can script the backup and restore of these solutions, and I can get the
current audit policy. To do that, right, let's take a look here, right. I need to specify the user;
here is a great command line – auditpol /get, and then I specify the user and domain. And
then...the Category is "Detailed Tracking","Object Access". I want to know every file that that
user has accessed, whether they have written to it, changed it, modified it, et cetera. So take a
look at auditpol, if you are a fella who enjoys working at the command line.

[The Object Access node of the Group Policy Management Editor window is open. The
instructor opens the command prompt and executes the auditpol /? Command and the
following output is displayed: Commands <only one command permitted per execution> /?
Help <context-sensitive> /get Displays the current audit policy. /set Sets the audit policy. /list
Displays selectable policy elements. /backup Saves the audit policy to a file. /restore Restores
the audit policy. /clear Clears the audit policy. /remove Removes the per-user audit policy for a
user account. /resourceSACL Configure global resource SACLs. The instructor then executes
the auditpol /get /? command and the following output is displayed in the partially displayed
command prompt: /? Help <context-sensitive> /user The security principal for whom the per-
user audit policy is queried. Either the /category or /subcategory option must be specified. The
user may be specified as a SID or name. If no user account is specified, then the system audit
policy is queried. /category One or more audit categories specified by GUID or name. An
asterisk <"*"> may be used to indicate that all audit categories should be queried.
/subcategory One or more audit subcategories specified by GUID or name. /sd Retrieves the
security descriptor used to delegate access to the audit policy. /option Retrieve existing policy
for CrashOnAuditFail, FullPrivilegeAuditing, AuditBaseObjects or AuditBaseDirectories. /r
Display the output in report <CSV> format. Sample usage:
auditpol /get /user:domain\user /category:"Detailed Tracking","Object Access"
auditpol /get /subcategory:"{0cce9212-69ae-11d9-bed3-505054503030}" /r
auditpol /get /option:CrashOnAuditFail auditpol /get /user:{S-1-5-21-397123417-1234567}
/category:"System" auditpol /get /sd]
Implementing DFS, FSRM, Encryption, and
Auditing
Learning Objective
After completing this topic, you should be able to
◾ configure DFS, FSRM, BitLocker, EFS, and Auditing

1. Using DFS and FSRM


Now that we have discussed how to configure and secure file services in Windows Server
2012 R2, let's try this exercise.

You are in the process of managing file servers and their resources to ensure availability and
security for users.

You want to make use of the Distributed File System solution, File Server Resource Manager,
encryption, and audit policies in Windows Server 2012 R2 to aid in your file server data
management.

Question

You are creating a Distributed File System, or DFS, replication group to simplify
administration. You have gone into the DFS Management console and launched the
New Replication Group wizard and also selected the type of group to create; in this
case a multipurpose group. Put the remaining steps for creating a replication group
into the correct order.

Options:

A. Uniquely name the group and add servers


B. Select the topology and allocate bandwidth to the replication group
C. Specify the primary member for the replication group
D. Set the local path on the groups member servers for replication folders
E. Create the replication group

Answer

Correct answer(s):

Uniquely name the group and add servers is ranked first


The first step is to give the replication group a name; this must be unique in the
domain that hosts it. You then add in the servers which will participate in the
replication group.
Select the topology and allocate bandwidth to the replication group is ranked second
The second step is to choose a topology and select how much bandwidth to
allocate for replication. You can also set the schedule here.
Specify the primary member for the replication group is ranked third
The third step is to specify the primary member for the replication group that
contains the content you need to replicate to the other group members.
Set the local path on the group's member servers for replication folders is ranked fourth
The fourth step is to set the local path on the group's member servers for the
folders you want to replicate.
Create the replication group is ranked fifth
The final step is to review your selections and create the replication group.
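
For reference, the same sequence can be scripted with the DFS Replication PowerShell cmdlets
that ship with Server 2012 R2; this is only a sketch, and the group, folder, server, and path
names are made up for illustration:

# Name the group and add the member servers
New-DfsReplicationGroup -GroupName "Docs-RG"
Add-DfsrMember -GroupName "Docs-RG" -ComputerName "FS1","FS2"

# Define the topology (a single connection here) and the replicated folder
Add-DfsrConnection -GroupName "Docs-RG" -SourceComputerName "FS1" -DestinationComputerName "FS2"
New-DfsReplicatedFolder -GroupName "Docs-RG" -FolderName "Docs"

# Set the local content paths, marking FS1 as the primary member
Set-DfsrMembership -GroupName "Docs-RG" -FolderName "Docs" -ComputerName "FS1" -ContentPath "D:\Docs" -PrimaryMember $true -Force
Set-DfsrMembership -GroupName "Docs-RG" -FolderName "Docs" -ComputerName "FS2" -ContentPath "D:\Docs" -Force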

Question

You are working with File Server Resource Manager, or FSRM, and want all
files whose names contain a specific word to be marked as confidential. You also want to
limit the amount of disk space a department can use on a specific file share. What
needs to be created in order to accomplish these tasks?

Options:

1. Classification rule
2. Quotas
3. File screen
4. Replication group

Answer

Option 1: Correct. With FSRM you have the ability to classify data according to your
organization's standards. FSRM allows administrators to automatically assign
classification information to files on file servers and apply policy to them based on
that information.
Option 2: Correct. Using quotas, FSRM allows you to monitor and limit the space
users can consume. It also allows you to limit a volume by setting a desired
threshold.

Option 3: Incorrect. File screens are used to control the type of files that users can
save. File screens will also send notifications when users attempt to save files which
breach the screening policy.

Option 4: Incorrect. Distributed File System, or DFS, replication groups are used to
simplify administration and management of shared folders by keeping your folder
structures synchronized across multiple nodes. This feature allows for simplified
migration of data and provides a level of redundancy.

Correct answer(s):

1. Classification rule
2. Quotas
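
For reference, both of those objects can also be created with the FSRM PowerShell cmdlets in
Server 2012 R2; a sketch only, with the share path and file group name chosen for illustration:

# Quota limiting the department share to 10 GB
New-FsrmQuota -Path "D:\Shares\Marketing" -Size 10GB

# File screen blocking files that match the built-in Audio and Video Files group
New-FsrmFileScreen -Path "D:\Shares\Marketing" -IncludeGroup "Audio and Video Files"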

2. Encryption services and audit policies

Question

You want to create a data recovery agent, or DRA, to allow your credentials to unlock
BitLocker-protected drives and recover encrypted data. You have already configured
the BitLocker identification field. Which node under the Public Key Policies Group
Policy settings do you use to identify the data recovery agent?

Options:

1. BitLocker Drive Encryption


2. Encrypting File System
3. BitLocker Drive Encryption Network Unlock Certificate

Answer

Option 1: To configure a recovery agent for BitLocker-protected drives, go to
Computer Configuration\Windows Settings\Security Settings\Public Key Policies,
right-click BitLocker Drive Encryption, and click Add Data Recovery Agent to start the
Add Recovery Agent Wizard.

Option 2: This option is selected when you want to define a recovery agent for the
Encrypting File System policy.
Option 3: This option is selected when you want to deploy the public key certificate
to computers to be able to unlock using the Network Unlock key.

Correct answer(s):

1. BitLocker Drive Encryption

Question

Removable storage can be a security risk for an enterprise, and to help mitigate
against this type of security risk you are going to track attempts to use removable
drives. Under which Advanced Audit Policy category do you configure this auditing?

Options:

1. Object Access
2. System
3. Privilege Use

Answer

Option 1: You can choose to audit and record successful attempts to write to or read
from a removable device, and set failure audits to record unsuccessful attempts to
access removable storage devices such as a USB drive. This setting
is configured via GPO by accessing Computer Configuration\Policies\Windows
Settings\Security Settings\Advanced Audit Policy Configuration\Object Access and
then enabling the Audit Removable Storage setting.

Option 2: You can use the settings under System to configure auditing for system
objects, such as the use of IPsec services, when system events get logged, and any
events that could affect the system's security subsystem.

Option 3: You can use the settings under Privilege Use to configure auditing on
the use of privileges, both sensitive and non-sensitive.

Correct answer(s):

1. Object Access

© 2018 Skillsoft Ireland Limited
