Palo Alto Networks Cybersecurity Academy


Cloud Security
Data centers have rapidly evolved from a traditional, closed environment with static, hardware-based
computing resources to one in which traditional and cloud computing technologies are mixed (see Figure 1-8).

Figure 1-8: Data centers are evolving to include a mix of hardware and cloud computing technologies.

The benefit of moving toward a cloud computing model – private, public, or hybrid – is that it improves
operational efficiencies and lowers capital expenditures:

• Optimizes existing hardware resources. Instead of using a “one server, one application” model,
you can run multiple virtual applications on a single physical server, which means that
organizations can leverage their existing hardware infrastructure by running more applications
within the same system, provided there are sufficient compute and memory resources on the
system.

• Reduces data center costs. Reduction of the server hardware “box” count not only reduces the
physical infrastructure real estate but also reduces data center costs for power, cooling, and rack
space, among others.

• Increases operational flexibility. Through the dynamic nature of virtual machine (VM) provisioning,
applications can be delivered more quickly than they can through the traditional method of
purchasing them, “racking/stacking,” cabling, and so on. This flexibility helps improve the agility of
the IT organization.

© 2020 Palo Alto Networks Cybersecurity Academy http://paloaltonetworksacademy.net


• Maximizes efficiency of data center resources. Because applications can experience asynchronous
or burst demand loads, virtualization provides a more efficient way to address resource contention
issues and maximize server use. It also provides a better way to deal with server maintenance and
backup challenges. For example, IT staff can migrate VMs to other virtualized servers or hypervisors
while performing hardware or software upgrades.

Cloud computing depends on virtualization

Cloud computing is not a location, but rather a pool of resources that can be rapidly provisioned in an
automated, on-demand manner. The U.S. National Institute of Standards and Technology (NIST) defines cloud
computing in Special Publication (SP) 800-145 as “a model for enabling ubiquitous, convenient, on-demand
network access to a shared pool of configurable computing resources (such as networks, servers, storage,
applications, and services) that can be rapidly provisioned and released with minimal management effort or
service provider interaction.”

The value of cloud computing is the ability to pool resources together to achieve economies of scale and
agility. This is true for private or public clouds. Instead of having many independent and often under-used
servers deployed for your enterprise applications, pools of resources are aggregated, consolidated, and
designed to be elastic enough to scale with the needs of your organization.

The move toward cloud computing not only brings cost and operational benefits but also technology benefits.
Data and applications are easily accessed by users no matter where they reside, projects can scale easily, and
consumption can be tracked effectively. Virtualization is a critical part of a cloud computing architecture that,
when combined with software orchestration and management tools, allows you to integrate disparate
processes so that they can be automated, easily replicated, and offered on an as-needed basis.

Cloud computing security considerations and requirements

With the use of cloud computing technologies, your data center environment can evolve from a fixed
environment where applications run on dedicated servers toward an environment that is dynamic and
automated, where pools of computing resources are available to support application workloads that can be
accessed anywhere, anytime, from any device.

Security remains a significant challenge when you embrace this new dynamic, cloud-computing fabric
environment. Many of the principles that make cloud computing attractive are counter to network security
best practices:

• Cloud computing doesn’t mitigate existing network security risks. The security risks that threaten
your network today do not change when you move to the cloud. The shared responsibility model
defines who (customer and/or provider) is responsible for what (related to security) in the public
cloud. In general terms, the cloud provider is responsible for security of the cloud, including the
physical security of the cloud data centers, and foundational networking, storage, compute, and
virtualization services. The cloud customer is responsible for security in the cloud, which is further
delineated by the cloud service model. For example, in an infrastructure-as-a-service (IaaS) model, the
cloud customer is responsible for the security of the operating systems, middleware, runtime,
applications, and data. In a platform-as-a-service (PaaS) model, the cloud customer is responsible for
the security of the applications and data and the cloud provider is responsible for the security of the
operating systems, middleware, and runtime. In a software-as-a-service (SaaS) model, the cloud customer is responsible only
for the security of the data and the cloud provider is responsible for the full stack from the physical
security of the cloud data centers to the application.
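
The layer split described above can be captured in a simple lookup. This is a minimal sketch, not an official taxonomy; the layer names and boundaries are simplified for illustration:

```python
# Illustrative sketch of the shared responsibility model described above.
# Layer names are simplified for illustration, not an official taxonomy.
STACK = [
    "physical", "network", "storage", "compute", "virtualization",  # provider-secured in all models
    "os", "middleware", "runtime", "applications", "data",
]

# Index into STACK at which customer responsibility begins, per service model.
CUSTOMER_BOUNDARY = {"iaas": 5, "paas": 8, "saas": 9}

def customer_responsibilities(model: str) -> list[str]:
    """Return the layers the cloud customer must secure under a given model."""
    return STACK[CUSTOMER_BOUNDARY[model]:]

print(customer_responsibilities("iaas"))  # os through data
print(customer_responsibilities("saas"))  # data only
```

The key point the sketch makes: the customer's slice grows as you move from SaaS toward IaaS, while the provider always secures the foundational layers.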

• Security requires isolation and segmentation; the cloud relies on shared resources. Security best
practices dictate that mission-critical applications and data be isolated in secure segments on the
network using the Zero Trust principle of “never trust, always verify.” On a physical network, Zero Trust
is relatively straightforward to accomplish using firewalls and policies based on application and user
identity. In a cloud computing environment, direct communication between VMs within a server and
across the data center (east-west traffic) occurs constantly, in some cases across varied levels of trust,
making segmentation a difficult task. Mixed levels of trust, when combined with a lack of intra-host
traffic visibility by virtualized port-based security offerings, may weaken an organization's security posture.
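
The "never trust, always verify" principle for east-west traffic can be sketched as a default-deny check: a flow passes only if an explicit rule allows it. The segment and application names below are hypothetical examples:

```python
# Minimal Zero Trust sketch for east-west traffic: every flow is denied
# unless an explicit (source, destination, application) rule allows it.
# Segment and application names are hypothetical examples.
ALLOW_RULES = {
    ("web-tier", "app-tier", "https"),
    ("app-tier", "db-tier", "mysql"),
}

def permit(src: str, dst: str, app: str) -> bool:
    """Default-deny: only explicitly allowed flows pass."""
    return (src, dst, app) in ALLOW_RULES

print(permit("web-tier", "app-tier", "https"))  # True - explicitly allowed
print(permit("web-tier", "db-tier", "mysql"))   # False - no direct web-to-db path
```

Note the design choice: the absence of a rule is a denial, which is what prevents lateral movement between workloads of mixed trust levels.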

• Security deployments are process-oriented; cloud computing environments are dynamic. The
creation or modification of your cloud workloads can often be done in minutes, yet the security
configuration for these workloads may take hours, days, or weeks. Security delays are not purposeful;
they’re the result of a process that is designed to maintain a strong security posture. Policy changes
need to be approved, the appropriate firewalls need to be identified, and the relevant policy updates
need to be determined. In contrast, the cloud is a highly dynamic environment, with workloads (and IP
addresses) constantly being added, removed, and changed. The result is a disconnect between security
policy and cloud workload deployments that leads to a weakened security posture. Security
technologies and processes must leverage capabilities such as cloning and scripted deployments to
automatically scale and take advantage of the elasticity of the cloud while maintaining a strong security
posture.
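
One way such automation can work is to resolve policy membership from workload attributes (tags) rather than from static IP lists, so policy tracks workloads as they are added and removed. This is a simplified sketch with hypothetical names, not any vendor's actual mechanism:

```python
# Sketch: resolving policy membership by workload tag instead of static IP,
# so policy keeps pace as workloads (and their IPs) come and go.
# Workload names, IPs, and tags are hypothetical examples.
inventory = {}  # workload name -> {"ip": ..., "tags": {...}}

def register(name, ip, tags):
    inventory[name] = {"ip": ip, "tags": set(tags)}

def deregister(name):
    inventory.pop(name, None)

def members(tag):
    """IPs currently matched by a tag-based policy object."""
    return sorted(w["ip"] for w in inventory.values() if tag in w["tags"])

register("web-01", "10.0.1.5", {"web", "prod"})
register("web-02", "10.0.1.6", {"web", "prod"})
print(members("web"))   # both web servers
deregister("web-01")
print(members("web"))   # membership updates with no manual rule edits
```

Because the rule references the tag, not the address, no approval cycle is needed when a workload is cloned or retired; the policy's intent stays fixed while its membership is computed.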

• Multi-tenancy is a key characteristic of the public cloud – and a key risk. Although public cloud
providers strive to ensure isolation between their various customers, the infrastructure and resources
in the public cloud are shared. Inherent risks in a shared environment include misconfigurations,
inadequate or ineffective processes and controls, and the “noisy neighbor” problem (excessive
network traffic, disk I/Os, or processor utilization can negatively impact other customers sharing the
same resource). In hybrid and multi-cloud environments that connect numerous public and/or private
clouds, the lines become still more blurred, complexity increases, and security risks become more
challenging to address.

As organizations transition from a traditional data center architecture to a public, private, or hybrid cloud
environment, enterprise security strategies must be adapted to support changing requirements in the cloud.
Key requirements for securing the cloud include:

• Consistent security in physical and virtualized form factors. The same levels of application control and
threat prevention should be used to protect both your cloud computing environment and your physical
network. First, you need to be able to confirm the identity of your applications and force them to
use only their standard ports. You also need to be able to block the use of rogue
applications while simultaneously looking for and blocking misconfigured applications. Finally,
application-specific threat prevention policies should be applied to block both known and unknown
malware from moving into and across your network and cloud environment.

• Segment your business applications using Zero Trust principles. To maximize the use of
computing resources, a relatively common current practice is to mix application workload trust levels
on the same compute resource. Although efficient, this practice of mixing trust levels introduces
new security risks in the event of a compromise. Your cloud security solution needs to be able to
implement security policies based on the concept of Zero Trust as a means of controlling traffic
between workloads while preventing lateral movement of threats.

• Centrally manage security deployments; streamline policy updates. Physical network security is still
deployed in almost every organization, so it’s critical to have the ability to manage both hardware and
virtual form factor deployments from a centralized location using the same management infrastructure
and interface. To ensure that security keeps pace with the speed of change that your workloads may
exhibit, your security solution should include features that will allow you to lessen or eliminate the
manual processes that security policy updates often require.

Existing data center security solution weaknesses

Existing data center security solutions exhibit the same weaknesses found when they are deployed at a
perimeter gateway on the physical network – they make their initial positive control network access decisions
based on port, using stateful inspection, then they make a series of sequential, negative control decisions
using bolted-on feature sets. There are several problems with this approach:

• “Ports first” limits visibility and control. Their focus on “ports first” limits their ability to see all
traffic on all ports, which means that evasive or encrypted applications, and any corresponding
threats that may or may not use standard ports, can slip through undetected. For example, many
data center applications such as Microsoft® Lync®, Active Directory®, and SharePoint® use a
wide range of contiguous ports to function properly. This means you need to open all those
ports first, exposing those same ports to other applications or cyber threats.
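
The exposure problem can be sketched by contrasting the two matching styles. The port range and application names below are illustrative, not the real ranges or identifiers these products use:

```python
# Contrast sketch: a port-based rule must open the whole range an application
# might use, while an app-aware rule matches the identified application.
# Port numbers and app names are illustrative examples only.

def port_rule_allows(port: int) -> bool:
    # Opens a wide high-port range so the collaboration app works --
    # every other application or threat on those ports is exposed too.
    return 1024 <= port <= 65535

def app_rule_allows(identified_app: str) -> bool:
    # Matches on the identified application, regardless of port.
    return identified_app in {"collaboration-app", "directory-service"}

print(port_rule_allows(4444))             # True - unrelated traffic slips in
print(app_rule_allows("unknown-tunnel"))  # False - blocked despite using an open port
```

The port rule cannot distinguish the intended application from anything else on the same ports; the app rule makes that distinction explicit.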

• They lack any concept of unknown traffic. Unknown traffic epitomizes the 80/20 rule – it is high
risk, but represents only a relatively small amount of traffic on every network. Unknown traffic
can be a custom application, an unidentified commercial off-the-shelf application, or a threat.
The common practice of blocking it all may cripple your business. Allowing it all is highly risky.
You need to be able to systematically manage unknown traffic using native policy management
tools to reduce your organizational security risks.
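
Systematic handling of unknown traffic can be sketched as a third disposition alongside allow and deny: unknown flows are routed to investigation rather than blindly blocked or permitted. The application identifiers below are hypothetical:

```python
# Sketch: unknown traffic gets its own disposition instead of a blanket
# allow-all or block-all. Application identifiers are hypothetical examples.
KNOWN_APPS = {"web-browsing": "allow", "smtp": "allow", "tor": "deny"}

def disposition(app_id):
    if app_id in KNOWN_APPS:
        return KNOWN_APPS[app_id]
    # Unknown: could be a custom app, an unidentified commercial
    # off-the-shelf app, or a threat -- flag it for investigation.
    return "investigate"

print(disposition("web-browsing"))  # allow
print(disposition(None))            # investigate
```

Once investigated, a custom application can be promoted into the known set with its own policy, shrinking the unknown category over time.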

• Multiple policies, no policy reconciliation tools. Their sequential traffic analysis (stateful
inspection, application control, IPS, anti-malware, etc.) requires a corresponding security policy
or profile, often using multiple management tools. The result is that your security policies
become convoluted as you build and manage a firewall policy with source, destination, user,
port, and action; an application control policy with similar rules; and any other threat prevention
rules required. Multiple security policies that mix positive (firewall) and negative (application
control, IPS, anti-malware) control models can cause security holes by missing traffic and/or not
identifying the traffic. This situation is made worse when there are no policy reconciliation tools.
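
The kind of gap a reconciliation tool would catch can be sketched as a cross-check between two independently managed policies. The rules and flows below are hypothetical examples:

```python
# Sketch: a tiny reconciliation check across two independently managed
# policies. Flows where the layers disagree are the holes a reconciliation
# tool would flag. Rules and flows are hypothetical examples.

def mismatches(firewall_policy, app_policy, flows):
    """Return flows where the firewall and app-control verdicts disagree."""
    out = []
    for src, port, app in flows:
        fw = firewall_policy.get((src, port), "deny")
        ac = app_policy.get(app, "deny")
        if fw != ac:
            out.append((app, port, fw, ac))
    return out

firewall_policy = {("any", 443): "allow", ("any", 25): "allow"}
app_policy = {"https": "allow", "smtp": "deny"}  # managed in a separate tool
flows = [("any", 443, "https"), ("any", 25, "smtp")]

print(mismatches(firewall_policy, app_policy, flows))  # smtp: port open, app denied
```

Here the firewall opens port 25 while the separately managed app-control policy denies SMTP; without a reconciliation pass, nothing surfaces the contradiction.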

• Cumbersome security policy update process. Existing security solutions in the data center do
not address the dynamic nature of your cloud environment because your policies have difficulty
contending with the numerous dynamic changes that are common in virtual data centers. In a
virtual data center, VM application servers often move from one physical host to another, so
your security policies must adapt to changing network conditions.

Many cloud security offerings are merely virtualized versions of port and protocol-based security appliances,
offering the same inadequacies as their physical counterparts.
