
UNIT-4

TOPIC 1: What is Cloud Security?

Cloud security refers to protecting data stored online via cloud computing environments
(instead of data centers) from theft, deletion, and leakage. There are many protective
methods that help secure the cloud; these measures include access control, firewalls,
penetration testing, obfuscation, tokenization, virtual private networks (VPN), and not using
public internet connections.


How Secure is the Cloud?

When it comes to network security concerns, the cloud itself is not the issue; rather, the challenge lies in the policies and technologies for security and control of that technology. Put simply, human error is one of the top reasons for data breaches in the cloud. In fact, Gartner estimates that through 2022, at least 95 percent of cloud security failures will be the customer's fault due to misconfigurations and mismanagement.

Therefore, it is not a question of whether the cloud is secure but whether the customer is using the cloud securely.


Examples of Cloud Security Compromised by Misconfiguration

Too often, misconfigured cloud-based systems lead to data breaches. For instance, in
2019, Capital One was breached by a malicious actor who stole the sensitive data of more
than 100 million people, using an attack that did not follow traditional hacker patterns.

The breach was the result of a misconfigured open-source web application firewall (WAF),
which Capital One used in its operations hosted on Amazon Web Services. The
misconfigured WAF was permitted to list all the files in any AWS data buckets and read the
contents of each file. The misconfiguration allowed the intruder to trick the firewall into
relaying requests to a key back-end resource on AWS.

The breach impacted 100 million U.S. citizens, and 140,000 Social Security numbers and 80,000 bank account numbers were compromised. In total, the breach cost Capital One roughly $150 million.

Anticipated Cloud Security Challenges in 2021

From April to May 2020, the Cloud Security Alliance (CSA) conducted a survey of
experienced cloud security architects, designers, and operators from large organizations to,
in part, determine the challenges of public cloud workloads in 2020. After surveying 200
respondents, they found that anticipated security challenges included:

o Visibility
o Data Privacy
o IAM Procedures
o Configuration Management
o Compliance Requirements
At the same time, the diversity of production workloads in the public cloud was also expected to increase in 2021, including the use of container platforms, function-as-a-service/serverless approaches, and cloud provider services. Use of virtual machines was also expected to increase.

7 Fundamentals of Cloud Security

Don’t just migrate to the cloud – prevent security threats by following these tips:

1. Understand what you’re responsible for – different cloud services require varying levels of responsibility. For instance, while software-as-a-service (SaaS) providers ensure that applications are protected and that data security is guaranteed, IaaS environments may not have the same controls. To ensure security, cloud customers need to double-check with their IaaS providers to understand who’s in charge of each security control.

2. Control user access – a huge challenge for enterprises has been controlling who has
access to their cloud services. Too often, organizations accidentally expose their
cloud storage service publicly, despite warnings from cloud providers to avoid allowing storage drive
contents to be accessible to anyone with an internet connection. CSO advises that only load
balancers and bastion hosts should be exposed to the internet. Further, do not allow Secure
Shell (SSH) connections directly from the internet as this will allow anyone who finds the
server location to bypass the firewall and directly access the data. Instead, use your cloud
provider’s identity and access control tools while also knowing who has access to what data
and when. Identity and access control policies should grant the minimum set of privileges
needed and only grant other permissions as needed. Configure security groups to have the
narrowest focus possible and where possible, use reference security group IDs. Finally,
consider tools that let you set access controls based on user activity data.
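
As a concrete illustration of "narrowest focus possible," the sketch below (Python with the boto3 AWS SDK; the security group ID and CIDR range are hypothetical placeholders) opens SSH only to an internal address range rather than the whole internet:

```python
# A minimal sketch of least-privilege ingress: SSH only from an internal
# CIDR, never 0.0.0.0/0. Group ID and CIDR below are hypothetical.
import boto3

ec2 = boto3.client("ec2")

ec2.authorize_security_group_ingress(
    GroupId="sg-0123456789abcdef0",  # hypothetical security group
    IpPermissions=[{
        "IpProtocol": "tcp",
        "FromPort": 22,
        "ToPort": 22,
        # Restrict SSH to the corporate VPN range instead of the open internet.
        "IpRanges": [{"CidrIp": "10.0.0.0/16",
                      "Description": "internal VPN only"}],
    }],
)
```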

3. Data protection – data stored on cloud infrastructures should never be unencrypted. Therefore, maintain control of encryption keys where possible. Even though you can hand the keys over to cloud service providers, it is still your responsibility to protect your data. By encrypting your data, you ensure that if a security configuration fails and exposes your data
to an unauthorized party, it cannot be used.
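
A minimal sketch of client-side encryption, using the Fernet recipe from the widely used Python cryptography package; key management (for example, a cloud KMS) is out of scope here and the data is illustrative:

```python
# Encrypt before upload so a misconfiguration exposing storage exposes
# only ciphertext. The key must live somewhere safe, not with the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # keep this in a key management service
f = Fernet(key)

ciphertext = f.encrypt(b"customer record: 4111-xxxx")  # store/upload this
plaintext = f.decrypt(ciphertext)    # only possible with the key
assert plaintext == b"customer record: 4111-xxxx"
```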

4. Secure credentials – AWS access keys can be exposed on public websites, source code
repositories, unprotected Kubernetes dashboards, and other such platforms. Therefore, you
should create and regularly rotate keys for each external service while also restricting access
on the basis of IAM roles. Never use root user accounts – these accounts should only be
used for specific account and service management tasks. Further, disable any user accounts that aren’t being used, to further limit the potential paths hackers can exploit.
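
The rotation pattern described above might look roughly like the following boto3 sketch; the IAM user name is hypothetical, and a real rollout would verify the new key works before retiring the old one:

```python
# A hedged sketch of access-key rotation: create a new key, deploy it,
# then deactivate and delete the old one. "service-user" is hypothetical.
import boto3

iam = boto3.client("iam")

new = iam.create_access_key(UserName="service-user")["AccessKey"]
# ... roll the new key out to the consuming service and verify it works ...

for old in iam.list_access_keys(UserName="service-user")["AccessKeyMetadata"]:
    if old["AccessKeyId"] != new["AccessKeyId"]:
        # Deactivate first so the change can be reverted if something breaks.
        iam.update_access_key(UserName="service-user",
                              AccessKeyId=old["AccessKeyId"],
                              Status="Inactive")
        iam.delete_access_key(UserName="service-user",
                              AccessKeyId=old["AccessKeyId"])
```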

5. Implement MFA – your security controls should be so rigorous that if one control fails,
other features keep the application, network, and data in the cloud safe. By tying MFA
(multi-factor authentication) to usernames and passwords, attackers have an even harder
time breaking in. Use MFA to limit access to management consoles, dashboards, and
privileged accounts.
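
For illustration, the second factor is often a time-based one-time password (TOTP), the scheme behind most authenticator apps. A minimal sketch using the Python pyotp package:

```python
# The server stores a per-user secret; the user's authenticator app and the
# server each derive the same short-lived 6-digit code from it.
import pyotp

secret = pyotp.random_base32()   # provisioned once per user, stored server-side
totp = pyotp.TOTP(secret)

print(totp.now())                # the code the user's app would display
assert totp.verify(totp.now())   # server-side check alongside the password
```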

6. Increase visibility – to see issues like unauthorized access attempts, turn on security
logging and monitoring once your cloud has been set up. Major cloud providers supply some
level of logging tools that can be used for change tracking, resource management, security
analysis, and compliance audits.
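
As one hedged example of such logging tools, AWS CloudTrail events can be queried programmatically; the sketch below (boto3) pulls recent console-login events for review:

```python
# Pull recent ConsoleLogin events from CloudTrail for security analysis.
# "ConsoleLogin" is a real CloudTrail event name; the rest is generic usage.
import boto3

cloudtrail = boto3.client("cloudtrail")

events = cloudtrail.lookup_events(
    LookupAttributes=[{"AttributeKey": "EventName",
                       "AttributeValue": "ConsoleLogin"}],
    MaxResults=50,
)
for e in events["Events"]:
    print(e["EventTime"], e.get("Username", "?"))
```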

7. Adopt a shift-left approach – with a shift-left approach, security considerations are incorporated early in the development process rather than at the final stage. Before an IaaS platform goes live, enterprises need to check all the code going into the platform while also auditing and catching potential misconfigurations before they happen. One tip – automate the auditing and correction process by choosing security solutions that integrate with Jenkins, Kubernetes, and others. Just remember to check that workloads are compliant before they’re put into production. Continuously monitoring your cloud environment is key here.

TOPIC 2: Vulnerability Assessment

Vulnerability Assessment is a process of evaluating security risks in software systems to reduce the probability of threats. The purpose of vulnerability testing is to reduce the possibility of intruders or hackers gaining unauthorized access to systems.

A vulnerability is any mistake or weakness in the system's security procedures, design, implementation, or internal controls that may lead to a violation of the system's security policy.

A vulnerability assessment process may involve automated and manual techniques with
varying degrees of rigor and an emphasis on comprehensive coverage. Using a risk-based
approach, vulnerability assessments may target different technology layers, the most
common being host, network, and application-layer assessments.

Vulnerability assessments provide security teams and other stakeholders with the
information they need to analyze and prioritize risks for remediation in the proper
context. Vulnerability assessments are a critical component of the vulnerability
management and IT risk management lifecycles, helping protect systems and data from
unauthorized access and data breaches.

Organizations of any size, or even individuals who face an increased risk of cyberattacks, can
benefit from some form of vulnerability assessment. Still, large enterprises and other
organizations subject to ongoing attacks will benefit most from vulnerability analysis.
Because security vulnerabilities can enable hackers to access IT systems and applications, enterprises need to identify and remediate weaknesses before they are exploited.

A comprehensive vulnerability assessment, along with a management program, can help companies improve the security of their systems.

Types of Vulnerability Assessments

Vulnerability assessment applies various methods, tools, and scanners to determine grey areas, threats, and risks. The right choice of assessment depends on how well weaknesses in the given systems can be discovered to address that specific need. Below are the different types of vulnerability assessment:

1. Network-based scans

It helps identify possible network security attacks. The scan helps zero in on vulnerable
systems on wired or wireless networks.
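
A toy sketch of the core idea behind a network-based scan, probing TCP ports on a host you are authorized to test (real scanners such as Nmap do far more):

```python
# Probe a set of TCP ports and report which ones accept connections.
import socket

def open_ports(host: str, ports: list[int], timeout: float = 0.5) -> list[int]:
    found = []
    for port in ports:
        try:
            with socket.create_connection((host, port), timeout=timeout):
                found.append(port)   # something answered on this port
        except OSError:
            pass                     # closed, filtered, or timed out
    return found

print(open_ports("127.0.0.1", [22, 80, 443, 3306]))
```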
2. Host-based scans

Host-based scans are used to locate and identify vulnerabilities in servers, workstations or
other network hosts. This type of scan usually examines ports and services that may also be
visible to network-based scans. It also provides excellent visibility into the configuration
settings and patch history of scanned systems.

3. Wireless network scans

Wireless network infrastructure is scanned to identify vulnerabilities. It helps in validating the security of a company's wireless network.

4. Application Scans

It is used to test websites to discover all known software vulnerabilities. It also identifies security vulnerabilities in web applications and their source code, through automated scans of the front end or through static or dynamic source code analysis.

5. Database Scans

Database scans aid in identifying grey areas in a database to prevent malicious attacks by cybercriminals. They identify rogue databases or insecure environments and classify sensitive data across an organization's infrastructure.

Vulnerability Assessments Benefits

Vulnerability assessments allow security teams to apply a consistent, comprehensive, and clear approach to identifying and resolving security threats and risks. This has several benefits for an organization, such as:

o Early and consistent identification of threats and weaknesses in IT security.

o Remediation actions to close any gaps and protect sensitive systems and
information.

o Meet cybersecurity compliance and regulatory needs for areas like HIPAA and PCI
DSS.

o Protect against data breaches and other unauthorized access.

o A vulnerability assessment provides an organization with information on the security weaknesses in its environment.
o It provides direction on how to assess the risks associated with those weaknesses.
This process offers the organization a better understanding of its assets, security
flaws and overall risk.

o The process of locating and reporting the vulnerabilities provides a way to detect
and resolve security problems by ranking the vulnerabilities before someone or
something can exploit them.

o In this process, operating systems, application software, and networks are scanned to identify vulnerabilities, including inappropriate software design, insecure authentication, etc.

Vulnerability Assessment Process

Below is the step-by-step vulnerability assessment process used to identify system vulnerabilities.

1. Goals and Objectives: Define the goals and objectives of the vulnerability analysis.

2. Scope: While performing the assessment and test, the assignment's scope needs to be clearly defined. There are three possible scopes:

o Black Box Testing: A software testing method in which an application's functionality is tested without knowledge of its internal code structure, implementation details, or internal paths. Black box testing focuses mainly on the input and output of software applications and is based entirely on software requirements and specifications. It is also known as behavioral testing.

o White Box Testing: A software testing technique in which the internal structure, design, and coding of software are tested to verify the flow of input and output and to improve design, usability, and security. In white-box testing, code is visible to testers, so it is also called clear box testing, open box testing, transparent box testing, code-based testing, and glass box testing.

o Grey Box Testing: A software testing technique used to test a software product or application with partial knowledge of its internal structure. The purpose of grey box testing is to find defects caused by improper code structure or improper use of applications. In this process, context-specific errors related to web systems are commonly identified. It increases testing coverage by concentrating on all layers of a complex system. Grey box testing is a combination of black box and white box testing.

3. Information Gathering: Obtain as much information about the IT environment as possible, such as networks, IP addresses, and operating system versions. This applies to all three scopes: black box, white box, and grey box testing.

4. Vulnerability Detection: In this step, vulnerability scanners scan the IT environment and identify the vulnerabilities.

5. Information Analysis and Planning: Analyze the identified vulnerabilities to devise a plan for penetrating the network and systems.

How to do Vulnerability Assessment

The following are the steps to perform a vulnerability assessment:


Step 1) Setup: We need to start by determining which systems and networks will be assessed, identifying where any sensitive data resides, and determining which data and systems are most critical. Then configure and update the tools.

Step 2) Test Execution: A packet is the unit of data routed between an origin and a destination. When any file (an e-mail message, an HTML file, a Uniform Resource Locator (URL) request, and so on) is sent from one place to another on the internet, the TCP layer of TCP/IP divides the file into several "chunks" for efficient routing. Each of these chunks is uniquely numbered and includes the Internet address of the destination. These chunks are called packets.

o Run the assessment tools against the captured data packets.

o When all the packets have arrived, they are reassembled into the original file by the TCP layer at the receiving end while the assessment tools run.

Step 3) Vulnerability Analysis: Now define and classify network or system resources and assign a priority to each resource (low, medium, or high). Identify potential threats to each resource and develop a strategy to deal with the highest-priority problems first. Define and implement ways to minimize the consequences if an attack occurs.
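
A minimal sketch of this prioritization step, ranking illustrative (not real) findings by a simple risk score of CVSS severity scaled by asset criticality:

```python
# Rank findings so the highest-risk items are remediated first.
findings = [
    {"host": "db-01",  "issue": "unpatched OpenSSL", "cvss": 9.8, "asset_weight": 3},
    {"host": "web-02", "issue": "weak TLS config",   "cvss": 5.3, "asset_weight": 2},
    {"host": "dev-07", "issue": "default password",  "cvss": 7.5, "asset_weight": 1},
]

# Simple risk score: severity scaled by how critical the asset is.
for f in findings:
    f["risk"] = f["cvss"] * f["asset_weight"]

for f in sorted(findings, key=lambda f: f["risk"], reverse=True):
    print(f"{f['risk']:5.1f}  {f['host']:8}  {f['issue']}")
```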

Step 4) Remediation: Use the vulnerability assessment results to patch key flaws or problems, whether simply via a product update or through something more involved, from installing new security tools to enhancing security procedures. Step 3 prioritized the problems to ensure the most urgent flaws are handled first. It's also worth noting that some problems may have so little impact that they are not worth the cost and downtime required for remediation.

Step 5) Repeat: Vulnerability assessments need to be conducted regularly, monthly or weekly, as any single assessment is only a report of that moment in time. Taken together, these reports give a strong sense of how the security posture has developed.

Vulnerability Testing Methods

Here are the following vulnerability testing methods, such as:

1. Active Testing: In active testing, a tester introduces new test data and analyzes the
results. During the testing process, the tester creates a mental model of the software,
which grows further during interaction with the software under test.
While doing the test, the tester actively works out new test cases and new ideas;
that is why it is called active testing.

2. Passive Testing: Passive testing monitors the results of running the software under test without introducing new test cases or data.

3. Network Testing: Network testing is the process of measuring and recording the
current state of network operation over a period of time.
Testing is mainly done to predict how the network will operate under load or to find
problems created by new services. The following network characteristics need to be
tested:

o Utilization levels

o Number of Users

o Application Utilization

4. Distributed Testing: Distributed tests are applied for testing distributed applications,
which work with multiple clients simultaneously. Testing a distributed application
means testing its client and server parts separately; by using a distributed testing
method, we can test them all together.
The test parts interact with each other during the test run, which keeps them
properly synchronized. Synchronization is one of the most crucial points in
distributed testing.

TOPIC 3: Cloud Computing Security Architecture

Security in cloud computing is a major concern. Proxy and brokerage services should be
employed to restrict a client from accessing the shared data directly. Data in the cloud
should be stored in encrypted form.

Security Planning

Before deploying a particular resource to the cloud, one needs to analyze several aspects of the resource, such as:

o Select the resource that needs to move to the cloud and analyze its sensitivity to risk.

o Consider cloud service models such as IaaS, PaaS, and SaaS. These models require the customer to be responsible for security at different service levels.

o Consider the cloud type, such as public, private, community, or hybrid.

o Understand the cloud service provider's system regarding data storage and its
transfer into and out of the cloud.

o The risk in cloud deployment mainly depends upon the service models and cloud
types.

Understanding Security of Cloud

Security Boundaries

The Cloud Security Alliance (CSA) stack model defines the boundaries between each service model and shows how different functional units relate. A particular service model defines the boundary between the service provider's responsibilities and the customer's.

Key Points to CSA Model

o IaaS is the most basic level of service, with PaaS and SaaS being the next two levels above it.

o Moving upwards, each service inherits the capabilities and security concerns of the
model beneath.

o IaaS provides the infrastructure, PaaS provides the platform development environment, and SaaS provides the operating environment.

o IaaS has the lowest integrated functionality and security level, while SaaS has the
highest.

o This model describes the security boundaries at which cloud service providers'
responsibilities end and customers' responsibilities begin.

o Any protection mechanism below the security boundary must be built into the system and maintained by the customer.

Although each service model has a security mechanism, security requirements also depend on where these services are located: in a private, public, hybrid, or community cloud.

Understanding data security


Since all data is transferred using the Internet, data security in the cloud is a major concern.
Here are the key mechanisms to protect the data.

o access control

o auditing

o authentication

o authorization

The service model should include security mechanisms working in all of the above areas.

Separate access to data

Since the data stored in the cloud can be accessed from anywhere, we need to have a
mechanism to isolate the data and protect it from the client's direct access.

Brokered cloud storage access is a way of separating storage from direct client access in the cloud. In this approach, two services are created:

1. A broker has full access to the storage but does not have access to the client.

2. A proxy does not have access to storage but has access to both the client and the
broker.

Working of a brokered cloud storage access system

When the client issues a request to access data:

1. The client data request goes to the external service interface of the proxy.

2. The proxy forwards the request to the broker.

3. The broker requests the data from the cloud storage system.

4. The cloud storage system returns the data to the broker.

5. The broker returns the data to the proxy.

6. Finally, the proxy sends the data to the client.
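
A minimal sketch of this brokered flow in Python; the class and method names are illustrative, not a real cloud API:

```python
# The proxy can reach the client and the broker; only the broker can
# reach the storage system, so clients never touch storage directly.
class CloudStorage:
    def read(self, key: str) -> bytes:
        return b"data for " + key.encode()

class Broker:
    """Has full access to storage, but is never exposed to clients."""
    def __init__(self, storage: CloudStorage):
        self._storage = storage
    def fetch(self, key: str) -> bytes:
        return self._storage.read(key)

class Proxy:
    """Client-facing interface; has no access to storage itself."""
    def __init__(self, broker: Broker):
        self._broker = broker
    def handle_request(self, key: str) -> bytes:
        return self._broker.fetch(key)   # forward to broker, relay the reply

proxy = Proxy(Broker(CloudStorage()))
print(proxy.handle_request("invoice-42"))  # client talks only to the proxy
```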

Encryption

Encryption helps to protect the data from being hacked. It protects the data being
transferred and the data stored in the cloud. Although encryption helps protect data from
unauthorized access, it does not prevent data loss.

Why is cloud security architecture important?

The difference between "cloud security" and "cloud security architecture" is that the former is built from problem-specific measures while the latter is built from threats. A cloud security architecture can reduce or eliminate the security holes that point-solution approaches almost certainly leave behind.

It does this by building down: defining threats starting with the users, moving to the cloud environment and service provider, and then to the applications. Cloud security architectures can also reduce redundancy in security measures, which contributes to threat mitigation and reduces both capital and operating costs.

The cloud security architecture also organizes security measures, making them more consistent and easier to implement, particularly during cloud deployments and redeployments. Security is often undermined when it is illogical or complex, and such flaws can be identified with a proper cloud security architecture.
Elements of cloud security architecture

The best way to approach cloud security architecture is to start with a description of the goals. The architecture has to address three things: an attack surface represented by external access interfaces, a protected asset set that represents the information being protected, and the vectors by which the system can be attacked, directly or indirectly, anywhere, including in the cloud.

The goal of the cloud security architecture is accomplished through a series of functional elements. These elements are often considered separately rather than as part of a coordinated architectural plan. They include access security or access control, network security, application security, contractual security, and monitoring, sometimes called service security. Finally, there is data protection, which consists of measures implemented at the protected-asset level.

A complete cloud security architecture addresses the goals by unifying the functional
elements.

Cloud security architecture and shared responsibility model

The security and security architectures for the cloud are not single-player processes. Most
enterprises will keep a large portion of their IT workflow within their data centers, local
networks, and VPNs. The cloud adds additional players, so the cloud security architecture
should be part of a broader shared responsibility model.

A shared responsibility model is both an architecture diagram and a contract form. It exists formally between a cloud user and each cloud provider, and with the network service provider if they are contracted separately.

Each will divide the components of a cloud application into layers, with the top layer being
the responsibility of the customer and the lower layer being the responsibility of the cloud
provider. Each separate function or component of the application is mapped to the
appropriate layer depending on who provides it. The contract form then describes how each
party responds.

TOPIC 4: Identity management and access control in cloud computing

Because the concept of identity management and access control in cloud computing covers most areas of technology, access control is merging and aligning with other combined activities. Some of these are automated using single sign-on capabilities; others operate in a standalone, segregated fashion.
The combination of access control and effective management of those technologies,


processes, and controls has given rise to identity and access management (IAM). In a
nutshell,

IAM includes people, processes, and systems that manage access to enterprise resources.

This is achieved by ensuring that the identity of an entity is verified (who are they, can they
prove who they are) and then granting the correct level of access based on the assets,
services, and protected resources being accessed.

IAM typically looks to utilize a minimum of two (preferably three or more) factors of authentication. Within cloud environments, services should include strong authentication mechanisms for validating users' identities and credentials.

In line with best practice, one-time passwords should be utilized as a risk reduction and mitigation technique.

The key phases that form the basis and foundation for IAM in the enterprise include the following:

1. Provisioning and deprovisioning

2. Centralized directory services

3. Privileged user management

4. Authorization and access management

Each is discussed in the following sections.

1. Provisioning and Deprovisioning

Provisioning and deprovisioning are critical aspects of identity management and access control in cloud computing. Think of setting up and removing users.
In the same way as you would set up an account for a user entering your organization who requires access to resources, provisioning is the process of creating accounts that allow users to access appropriate systems and resources within the cloud environment.

The ultimate goal of user provisioning is to standardize, streamline, and create an efficient
account creation process while creating a consistent, measurable, traceable, and auditable
framework for providing access to end-users.

Deprovisioning is the process whereby a user account is disabled when the user no longer
requires access to the cloud-based services and resources.

This is not just limited to a user leaving the organization but may also be due to a user
changing a role, function, or department.

Deprovisioning is a risk-mitigation technique that ensures authorization creep, or additional and historical privileges, is not retained, which would grant access to data, assets, and resources that are not necessary to fulfill the job role.

2. Centralized Directory Services

As when building a house or large structure, the foundation is key. In the world of IAM, the
directory service forms the foundation for IAM and security both in an enterprise
environment and within a cloud deployment.

A directory service stores, processes, and facilitates a structured repository of information, coupled with unique identifiers and locations.

The primary protocol for centralized directory services is the Lightweight Directory Access Protocol (LDAP), built on and focused on the X.500 standard. LDAP works as an application protocol for querying and modifying items in directory service providers like Active Directory.

Active Directory is a database-based system that offers authentication, directory, policy, and
other services to a network. Essentially, LDAP acts as a communication protocol to interact
with Active Directory.

LDAP directory servers store their data hierarchically (similar to domain name system [DNS] trees and UNIX file structures), with a directory record's distinguished name (DN) read from the individual entry back through the tree, up to the top level.
Each entry in an LDAP directory server is identified through its DN. Access to directory services should be part of the IAM solution and should be as robust as the core authentication modes used.
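
A hedged sketch of an LDAP query using the Python ldap3 package; the server address, bind DN, and base DN are hypothetical, and in practice the bind account should be a least-privilege service identity:

```python
# Query a directory over LDAP; each result's DN locates it in the tree.
from ldap3 import Server, Connection, ALL

server = Server("ldaps://directory.example.com", get_info=ALL)
conn = Connection(server,
                  user="cn=iam-svc,dc=example,dc=com",  # hypothetical service DN
                  password="***", auto_bind=True)

# Look up one user entry by uid and read a couple of attributes.
conn.search("dc=example,dc=com", "(uid=alice)", attributes=["cn", "memberOf"])
for entry in conn.entries:
    print(entry.entry_dn, entry.cn)
```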

The use of privileged identity management (PIM) features is strongly encouraged for
managing access of the administrators of the directory.

If these are hosted locally rather than in the cloud, the IAM service requires connectivity to
the local LDAP servers, in addition to any applications and services for which it is managing
access.

Within cloud environments, directory services are heavily utilized and depended upon as
the go-to trusted source by the IAM framework as a secure repository of identity and access
information.

The same can be said for federated environments.

Again, trust and confidence in the accuracy and integrity of the directory services are must-
haves.

3. Privileged User Management

As the name implies, privileged user management focuses on the process and ongoing
requirements to manage the lifecycle of user accounts with the highest privileges in a
system.

Privileged accounts typically carry the highest risk and impact because compromised
privileged user accounts can lead to significant permissions and access rights being
obtained, thus allowing the user or attacker to access resources and assets that may
negatively affect the organization.

The key components from a security perspective relating to privileged user management should, at a minimum, include the ability to track usage, authentication successes and failures, and authorization times and dates; log successful and failed events; enforce password management; and contain sufficient levels of auditing and reporting related to privileged user accounts.

Many organizations monitor this level of information for standard or general users, which
would be beneficial and useful in the event of an investigation; however, the privileged
accounts should capture this level of detail by default because attackers often target and
compromise a general or standard user, with the view to escalating privileges to a more
privileged or admin account.
Although a number of these components are technical by nature, the overall requirements used to manage them should be driven by organizational policies and procedures.

Note that segregation of duties can form an extremely effective mitigation and risk
reduction technique around privileged users and their ability to effect major changes.

4. Authorization and Access Management

Access to devices, systems, and resources forms a key driver for the use of cloud services (broad network access); without it, the overall benefits the service may provide to the enterprise are reduced, and legitimate business or organizational users are isolated from their resources and assets.

In the same way that users require authorization and access management to be operating
and functioning to access the required resources, security requires these service
components to be functional, operational and trusted to enforce security within cloud
environments. In its simplest form, authorization determines the user’s right to access a
certain resource.

(Think of entry onto a plane with your reserved seat or when you may be visiting an official
residence or government agency to visit a specified person.) Access management is focused
on the manner and way in which users can access relevant resources, based on their
credentials and characteristics of their identity.

Think of a bank or highly secure venue where only certain employees or personnel can access the main safe or highly sensitive areas.

Note that both authorization and access management are point-in-time activities that rely
on the accuracy and ongoing availability of resources and functioning processes, segregation
of duties, privileged user management, password management, and so on, to operate and
provide the desired levels of security.

If one of the mentioned activities is not carried out regularly as part of an ongoing managed
process, it can weaken the overall security posture.
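
In its simplest form, the authorization decision described above can be modeled as a role-to-permission lookup. A minimal sketch with illustrative roles and permissions:

```python
# Given a verified identity's role, check it against the permission a
# resource requires; anything not explicitly granted is denied.
ROLE_PERMISSIONS = {
    "teller":  {"accounts:read"},
    "manager": {"accounts:read", "accounts:write", "safe:open"},
}

def is_authorized(role: str, permission: str) -> bool:
    return permission in ROLE_PERMISSIONS.get(role, set())

assert is_authorized("manager", "safe:open")       # allowed
assert not is_authorized("teller", "safe:open")    # denied: least privilege
```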

TOPIC 5: What is data at rest?

"Data at rest" is data currently in storage, typically on a computer's or server's hard disk.
Data at rest contrasts with data in transit — also called data in motion — which is the state
of data as it travels from one place to another. It also contrasts with data in use — data
loaded into memory and actively in use by a software program.

Type              Where is it?
Data at rest      Storage
Data in transit   Traveling over networks
Data in use       Memory

Suppose Bob wants to send Alice a picture of a cheeseburger. Bob took the picture on his
smartphone, which has stored it ever since — the cheeseburger photo is currently data at
rest. Bob views the photo and attaches it to an email, which loads the photo into memory —
it becomes data in use (specifically by his phone's photo viewer and email applications). Bob
taps "Send," and the email with the attached photo travels over the Internet to Alice's email
service; it has become data in transit.

What dangers does data at rest face?

Each state of data — at rest, in transit, in use — faces the risk of discovery or exposure by a
malicious party. However, the risks are not the same across all of these states. For instance,
data in transit can be intercepted by an unauthorized party, while data at rest cannot,
because it does not move.

Data at rest still makes an attractive target for attackers, who may aim to encrypt the data
and hold it for ransom, steal the data, or corrupt or wipe the data.

No matter the method, the end goal is to access the data at rest and take malicious action,
often with financial gain in mind:

o Ransomware is a type of malware that, once it enters a system, encrypts data at rest, rendering it unusable. Ransomware attackers decrypt the data once the victim pays a fee.

o A data breach can occur if data at rest is moved or leaked into an unsecured environment. Data breaches can be intentional, as when an external attacker or malicious insider purposefully accesses the data to copy or leak it. They can also be accidental, such as when a server is left exposed to the public Internet, leaking the data stored within.

o Unauthorized or excessive access to data at rest also puts it at risk. Attackers may fake or steal credentials to gain access.

o Physical theft can impact data at rest if someone steals the laptop, tablet, smartphone, or other device on which the data at rest lives.
What is data at rest encryption?

Encryption is the process of scrambling data in such a way that it can only be unscrambled
by using a key (a key is a string of randomized values, like "FFBD29F83C2DA1427BD"). Hard
disk encryption is the technology used to encrypt data at rest.

Data at rest encryption is like locking away important papers in a safe. Only those with the
key can access the stored papers; similarly, only parties with the encryption key can access
data at rest.

Encrypting data at rest protects it from negative outcomes like data breaches, unauthorized
access, and physical theft. Without the key, the data is useless.
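
A hedged sketch of authenticated encryption for data at rest, using AES-GCM from the Python cryptography package; key storage (an HSM or KMS) is out of scope, and the record shown is illustrative:

```python
# The nonce must be unique per encryption and is stored with the ciphertext.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # hold this in a KMS, not on disk
aesgcm = AESGCM(key)

nonce = os.urandom(12)
ciphertext = aesgcm.encrypt(nonce, b"ssn=123-45-6789", None)

# Without the key the stored bytes are useless; with it, decryption also
# verifies the data was not tampered with.
assert aesgcm.decrypt(nonce, ciphertext, None) == b"ssn=123-45-6789"
```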

(Note that encryption is also crucial for protecting data in transit. The main technology for encrypting data in transit is Transport Layer Security (TLS).)

How does identity and access management (IAM) protect data at rest?

Restricting who can access data is a crucial part of protecting it. The more people who can
access data, the greater the chances of a breach. And without strong access controls,
unauthorized parties may be able to alter, copy, steal, or destroy data at rest. In fact, many
ransomware attacks use lateral movement to acquire the credentials they need to access,
and then alter, data at rest.

Identity and access management (IAM) is the practice of managing a user's identity and
what they are allowed to do. IAM helps keep data at rest secure by authenticating users and
checking their authorization for viewing and editing data at rest.

Why is protecting data at rest important in cloud computing?

Before the Internet and cloud computing, data at rest was kept on a user's computer or on
an organization's on-premise servers. However, as many organizations move to the cloud,
data at rest is stored on remote servers managed by an external vendor. Without direct
access to the data, organizations that use cloud infrastructure should evaluate their
providers' cloud storage security measures and make sure their cloud deployments are
configured correctly.

Cloud security posture management (CSPM) tools can help automate the process of
identifying security misconfigurations that could compromise data at rest.

Additionally, Cloudflare Zero Trust protects data at rest whether it is stored locally or remotely in the cloud, helping control access, filter out malicious web traffic, and verify devices for better organizational security.

Data theft can devastate any company, resulting in lost profits, regulatory enforcement, litigation, and reputational damage that can be difficult to overcome. Every organization must protect its customer data and ensure that sensitive information is kept safe.

That said, the data in your company’s possession is held in different states – and each of
these states has particular vulnerabilities. A security tactic that works for one state may be
inefficient for another. Knowing the biggest threats for each type of data can help you to
design controls that will keep all of your clients’ personally identifiable information safe.

What Are the Three States of Data?

To create the best information security system for your data, you’ll first need to understand
the differences among data at rest, data in motion, and data in use.

Data at Rest

“Data at rest” is data that is not being used or transferred. This means that the data is likely
being stored on a hard drive, flash drive, or another device. The term can also refer to data
stored in a cloud service, such as Microsoft Azure or Amazon Web Services (AWS). Data at
rest is easier to secure, but thieves typically see this data as a more attractive target
because there’s more of it to steal. Data at rest is also more vulnerable to malicious attacks
from employees who have access to the storage network.

Data in Motion/Flight

This refers to data moving from one location to another, such as between two storage
devices. This includes downloads, transfers, or any other means of taking data from one
place to another. Of the three states, data in motion is by far the most vulnerable. In
particular, Man in the Middle (MitM) attacks (where a malicious actor inserts himself into a
transaction or transfer) are easier to execute with data in motion.

Data in Use

This refers to data being accessed or used at any given moment. If you are creating,
deleting, editing, or processing your data, then it is “in use” until such time as it is stored (at
rest) or transferred (in motion).

What Are the Best Practices for Keeping Data at Rest Safe?

As we discussed previously, data at rest can be a tempting target for hackers. The following
defense methods can help you protect your stored data.

Keep Your Storage Organized


Disorganized or unclassified data storage can create unnecessary vulnerabilities that hackers
will exploit. Creating a well-maintained and well-documented storage system will help you
to keep an eye on all of your information and to notice whether anything has been
disturbed. Classifying your data based on how valuable or vulnerable it is will also help you
to allocate your defense resources correctly.

Encrypt Your Data

Data encryption translates your data into a code that is undecipherable to anyone without
the correct encryption keys. Encrypting your data while it is stored means that even if
unauthorized access happens, the encrypted data will not be of use to the thieves.
Encryption is a powerful tool that can keep information from falling into the wrong hands.

Practice Due Diligence

It’s imperative that you understand where your data is stored and what safeguards are in
place. If you’re using cloud storage, your server will likely have a number of protections in
place; those are a great addition to a strong defense strategy. Depending on these outside
controls alone, however, can have disastrous results. Whatever cloud services you use, you
still have a responsibility to create security controls for your storage and make sure the data
is safe.

What Are the Best Practices for Keeping Data in Motion/Flight Safe?

Data in motion is harder to protect than stored data. These security measures will provide a
solid foundation for your data protection efforts.

Secure Your Network

Your company will need to access and transmit data to provide the best service for your
clients; leaving critical data in storage is simply not an option. This means that your private
network must be secured and protected so that data is safe while in transit. There are a
variety of ways to secure your network, such as implementing firewalls or using encrypted
connections and SSL certificates. Whatever method you choose, your goal should be to
ensure that no outside actors are able to see or acquire sensitive data while it moves within your company.
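
A minimal sketch of an encrypted connection using only the Python standard library; certificate and hostname verification are enabled by default with create_default_context():

```python
# Wrap a TCP socket in TLS so data in motion is encrypted end to end.
import socket
import ssl

context = ssl.create_default_context()   # verifies certs against system CAs

with socket.create_connection(("example.com", 443)) as raw:
    with context.wrap_socket(raw, server_hostname="example.com") as tls:
        print(tls.version())             # e.g. "TLSv1.3"
        tls.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls.recv(200).decode(errors="replace"))
```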

Restrict Access to Data

An important part of data protection is knowing who has access to it and who is responsible
for any transmissions. This means that restricting data access on a need-to-know basis and
implementing access logs can help you track when data is being moved and who was
responsible for moving it. If a breach occurs at this level, access control will make it easier to
determine where it originated and what must be done to prevent future breaches.
Encourage Good Security Hygiene

The safety of data transmitted within your company depends on the security of your staff’s
accounts. Usually data will be transmitted via email or messaging apps, and your employees
need to have unique names and strong passwords to keep that data secure. Multi-factor
authentication is also a great option for protecting data, and your employees should be
encouraged to implement such protection on their accounts.

Endpoint security should also be a concern, and any laptops or mobile devices used by your
staff must be secured. Training your staff on the importance of data security should be an
integral part of your overall data management plan.

Use ZenGRC for Data Protection

Your best defense in the face of data theft is preparedness. Part of that preparedness is a risk management system tailored to your company's needs. Organizing this important information in spreadsheets can result in redundancies or, worse, risks that slip through the cracks. How can you best streamline your risk management and protect your customers' data?

ZenGRC is an integrated software solution that allows you to view your entire risk landscape
in real time. This innovative platform will help you track and assign risks for all data,
whether it’s at rest or in motion.


TOPIC 6: What is virtualized security?

Virtualized security, or security virtualization, refers to security solutions that are software-
based and designed to work within a virtualized IT environment. This differs from
traditional, hardware-based network security, which is static and runs on devices such as
traditional firewalls, routers, and switches.

In contrast to hardware-based security, virtualized security is flexible and dynamic. Instead of being tied to a device, it can be deployed anywhere in the network and is often cloud-based. This is key for virtualized networks, in which operators spin up workloads and applications dynamically; virtualized security allows security services and functions to move around with those dynamically created workloads.

Cloud security considerations (such as isolating multitenant environments in public cloud environments) are also important to virtualized security. The flexibility of virtualized security is helpful for securing hybrid and multi-cloud environments, where data and workloads migrate around a complicated ecosystem involving multiple vendors.
What are the benefits of virtualized security?

Virtualized security is now effectively necessary to keep up with the complex security
demands of a virtualized network, plus it’s more flexible and efficient than traditional
physical security. Here are some of its specific benefits:

o Cost-effectiveness: Virtualized security allows an enterprise to maintain a secure network without a large increase in spending on expensive proprietary hardware. Pricing for cloud-based virtualized security services is often determined by usage, which can mean additional savings for organizations that use resources efficiently.

o Flexibility: Virtualized security functions can follow workloads anywhere, which is crucial in a virtualized environment. It provides protection across multiple data centers and in multi-cloud and hybrid cloud environments, allowing an organization to take advantage of the full benefits of virtualization while also keeping data secure.

o Operational efficiency: Quicker and easier to deploy than hardware-based security, virtualized security doesn't require IT teams to set up and configure multiple hardware appliances. Instead, they can set up security systems through centralized software, enabling rapid scaling. Using software to run security technology also allows security tasks to be automated, freeing up additional time for IT teams.

o Regulatory compliance: Traditional hardware-based security is static and unable to keep up with the demands of a virtualized network, making virtualized security a necessity for organizations that need to maintain regulatory compliance.

How does virtualized security work?

Virtualized security can take the functions of traditional security hardware appliances (such
as firewalls and antivirus protection) and deploy them via software. In addition, virtualized
security can also perform additional security functions. These functions are only possible
due to the advantages of virtualization, and are designed to address the specific security
needs of a virtualized environment.

For example, an enterprise can insert security controls (such as encryption) between the
application layer and the underlying infrastructure, or use strategies such as micro-
segmentation to reduce the potential attack surface.

Virtualized security can be implemented as an application directly on a bare metal hypervisor (a position it can leverage to provide effective application monitoring) or as a hosted service on a virtual machine. In either case, it can be quickly deployed where it is most effective, unlike physical security, which is tied to a specific device.
What are the risks of virtualized security?

The increased complexity of virtualized security can be a challenge for IT, which in turn leads
to increased risk. It’s harder to keep track of workloads and applications in a virtualized
environment as they migrate across servers, which makes it more difficult to monitor
security policies and configurations. And the ease of spinning up virtual machines can also
contribute to security holes.

It’s important to note, however, that many of these risks are already present in a virtualized
environment, whether security services are virtualized or not. Following enterprise
security best practices (such as spinning down virtual machines when they are no longer
needed and using automation to keep security policies up to date) can help mitigate such
risks.

How is physical security different from virtualized security?

Traditional physical security is hardware-based, and as a result, it’s inflexible and static. The
traditional approach depends on devices deployed at strategic points across a network and
is often focused on protecting the network perimeter (as with a traditional firewall).
However, the perimeter of a virtualized, cloud-based network is necessarily porous and
workloads and applications are dynamically created, increasing the potential attack surface.

Traditional security also relies heavily upon port and protocol filtering, an approach that’s
ineffective in a virtualized environment where addresses and ports are assigned
dynamically. In such an environment, traditional hardware-based security is not enough; a
cloud-based network requires virtualized security that can move around the network along
with workloads and applications.

What are the different types of virtualized security?

There are many features and types of virtualized security, encompassing network
security, application security, and cloud security. Some virtualized security technologies are
essentially updated, virtualized versions of traditional security technology (such as next-
generation firewalls). Others are innovative new technologies that are built into the very
fabric of the virtualized network.

Some common types of virtualized security features include:

o Segmentation, or making specific resources available only to specific applications and users. This typically takes the form of controlling traffic between different network segments or tiers.

o Micro-segmentation, or applying specific security policies at the workload level to create granular secure zones and limit an attacker's ability to move through the network. Micro-segmentation divides a data center into segments and allows IT teams to define security controls for each segment individually, bolstering the data center's resistance to attack (see the sketch after this list).

o Isolation, or separating independent workloads and applications on the same network. This is particularly important in a multitenant public cloud environment, and it can also be used to isolate virtual networks from the underlying physical infrastructure, protecting the infrastructure from attack.
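
A minimal sketch of the micro-segmentation idea referenced above: default-deny between workloads, with explicit allow rules per segment pair and port (segment names and rules are illustrative):

```python
# Default-deny policy: traffic is allowed only if an explicit rule exists.
ALLOW = {
    ("web-tier", "app-tier", 8443),   # web may call the app tier over TLS
    ("app-tier", "db-tier", 5432),    # app may reach the database
}

def is_traffic_allowed(src: str, dst: str, port: int) -> bool:
    return (src, dst, port) in ALLOW   # anything not listed is denied

assert is_traffic_allowed("app-tier", "db-tier", 5432)
assert not is_traffic_allowed("web-tier", "db-tier", 5432)  # no direct DB access
```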
