Unit 4-Cloud Computing Notes
Case (b)
Under-provisioning of resources results in losses for both the user and the provider: users have
paid for the demand, but the demand above the provisioned capacity (the shaded area) is not served.
Case (c)
Fig: Cloud resource deployment using an IGG (InterGrid Gateway) to allocate VMs
from a local cluster and to interact with the IGG of a public cloud provider.
o Under peak demand, this IGG interacts with another IGG that can allocate resources
from a cloud computing provider.
o A grid has predefined peering arrangements with other grids, which the IGG manages.
o Through multiple IGGs, the system coordinates the use of InterGrid resources.
o An IGG is aware of the peering terms with other grids, selects suitable grids that
can provide the required resources, and replies to requests from other IGGs.
o Request redirection policies determine which peering grid the InterGrid selects to
process a request, and the price at which that grid will perform the task (see the
sketch after this list).
o An IGG can also allocate resources from a cloud provider.
o The InterGrid allocates and provides a distributed virtual environment (DVE).
o This is a virtual cluster of VMs that runs isolated from other virtual clusters.
o A component called the DVE manager performs resource allocation and
management on behalf of specific user applications.
o The core component of the IGG is a scheduler for implementing provisioning
policies and peering with other gateways.
o The communication component provides an asynchronous message-passing mechanism.
o The managers provide a public API for users to submit and control the VMs.
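The following is a minimal sketch (Python), not the actual InterGrid implementation, of how an
IGG-style scheduler might apply a request redirection policy: serve locally when capacity allows,
otherwise pick the cheapest peering grid that can supply the VMs, and fall back to a cloud
provider. All class names, fields, and prices are illustrative assumptions.

from dataclasses import dataclass


@dataclass
class Peer:
    name: str
    free_vms: int        # capacity the peer currently offers
    price_per_vm: float  # price agreed in the peering arrangement


class InterGridGateway:
    def __init__(self, local_free_vms, peers, cloud_price_per_vm):
        self.local_free_vms = local_free_vms
        self.peers = peers
        self.cloud_price_per_vm = cloud_price_per_vm

    def redirect(self, requested_vms):
        """Decide where a request for VMs should be served."""
        if requested_vms <= self.local_free_vms:
            return "local cluster"
        # Redirection policy: pick the cheapest peer that can cover the request.
        candidates = [p for p in self.peers if p.free_vms >= requested_vms]
        if candidates:
            best = min(candidates, key=lambda p: p.price_per_vm)
            return f"peer grid {best.name} at {best.price_per_vm}/VM"
        # Otherwise fall back to a public cloud provider.
        return f"cloud provider at {self.cloud_price_per_vm}/VM"


igg = InterGridGateway(local_free_vms=4,
                       peers=[Peer("GridA", 10, 0.08), Peer("GridB", 20, 0.05)],
                       cloud_price_per_vm=0.10)
print(igg.redirect(12))   # -> "peer grid GridB at 0.05/VM"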
Independent Service Management:
Independent services request facilities to execute many unrelated tasks.
Commonly, the APIs provided are web services that developers can use
conveniently.
Distributed VM Management
A distributed VM manager makes requests for VMs and queries their status.
This manager requests VMs from the gateway on behalf of the user application.
The manager obtains the list of requested VMs from the gateway.
This list contains a tuple of public IP/private IP addresses for each VM, which can be
accessed through Secure Shell (SSH) tunnels.
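A brief illustrative sketch (Python) of the behaviour just described; the gateway client and its
request_vms call are assumed placeholders, not a real InterGrid API. It shows the shape of the
data involved: one (public IP, private IP) tuple per VM, with each VM reached over SSH.

from dataclasses import dataclass
import subprocess


@dataclass
class VmEndpoint:
    public_ip: str
    private_ip: str


class DveManager:
    def __init__(self, gateway):
        self.gateway = gateway  # hypothetical IGG client object

    def acquire(self, count):
        # Request VMs from the gateway on behalf of the user application
        # and collect the address tuples it returns.
        raw = self.gateway.request_vms(count)            # assumed call
        return [VmEndpoint(pub, priv) for pub, priv in raw]

    def run(self, vms, command):
        # Reach each VM through its SSH tunnel and run the application task.
        for vm in vms:
            subprocess.run(["ssh", f"user@{vm.public_ip}", command], check=True)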
It is not possible for a cloud infrastructure provider to establish its data centers at all
possible locations throughout the world.
4.2 Security
Virtual machines from multiple organizations have to be co-located on the same physical
server in order to maximize the efficiencies of virtualization.
Cloud service providers must learn from the managed service provider (MSP) model and
ensure that their customers' applications and data are secure if they hope to retain their
customer base and competitiveness.
The cloud environment should be free from abuse, cheating, hacking, viruses, rumors, and
privacy and copyright violations.
4.2.1 Cloud Security Challenges
In the cloud model, users lose control over physical security.
In a public cloud, users are sharing computing resources with other companies.
When users share the cloud environment, their data is at risk of seizure
(attack).
Storage services provided by one cloud vendor may be incompatible with another
vendor’s services; this makes it difficult to move data from one vendor to the other.
Vendors create “sticky services”.
Sticky services are services that make it difficult for end users to move
from one cloud vendor to another.
Example: Amazon’s Simple Storage Service (S3) is incompatible with IBM’s Blue Cloud,
Google’s, or Dell’s offerings.
Customers want their data encrypted while data is at rest (data stored) in the cloud
vendor’s storage pool.
Data integrity means ensuring that data is identically maintained during any operation
(such as transfer, storage, or retrieval).
Data integrity is assurance that the data is consistent and correct.
One of the key challenges in cloud computing is data-level security.
It is difficult for a customer to find where its data resides on a network controlled by its
provider.
Some countries have strict limits on what data about its citizens can be stored and for
how long.
Banking regulators require that customers’ financial data remain in their home country.
Security managers will need to pay particular attention to systems that contain critical
data such as corporate financial information.
Outsourcing (giving rights to a third party) means losing control over data, which is not
a good idea from a security perspective.
Security managers have to interact with the company’s legal staff to ensure that appropriate
contract terms are in place to protect corporate data.
Cloud-based services will result in many mobile IT users accessing business data and
services without traversing the corporate network.
This will increase the need for enterprises to place security controls between mobile users
and cloud-based services.
Placing large amounts of sensitive data in a globally accessible cloud leaves
organizations open to large distributed threats—attackers no longer have to come onto the
premises to steal data, and they can find it all in the one "virtual" location.
Virtualization efficiencies in the cloud require virtual machines from multiple
organizations to be collocated on the same physical resources.
Although traditional data center security still applies in the cloud environment, physical
segregation and hardware-based security cannot protect against attacks between virtual
machines on the same server.
The dynamic and fluid nature of virtual machines will make it difficult to maintain the
consistency of security and ensure the auditability of records.
The ease of cloning and distribution between physical servers could result in the
propagation of configuration errors and other vulnerabilities.
Localized virtual machines and physical servers use the same operating systems as well
as enterprise and web applications in a cloud server environment, increasing the threat of
an attacker or malware exploiting vulnerabilities in these systems and applications
remotely.
Virtual machines are vulnerable as they move between the private cloud and the public
cloud.
4.2.2 Software as a Service Security (Or) Data Security (Or) Application Security (Or)
Virtual Machine Security.
Cloud computing models of the future will likely combine the use of SaaS (and other
XaaS's as appropriate), utility computing, and Web 2.0 collaboration technologies to leverage the
Internet to satisfy their customers' needs. New business models being developed as a result of the
move to cloud computing are creating not only new technologies and business operational
processes but also new security requirements and challenges.
Fig: Evolution of Cloud Services
SaaS is the dominant cloud service model, and this is the area where security practices are
most critically needed.
Security issues to discuss with a cloud-computing vendor:
1. Privileged user access—Inquire about who has specialized access to data, and about the
hiring and management of such administrators.
2. Regulatory compliance—Make sure that the vendor is willing to undergo external
audits and/or security certifications.
3. Data location—Does the provider allow for any control over the location of data?
4. Data segregation—Make sure that encryption is available at all stages, and that these
encryption schemes were designed and tested by experienced professionals.
5. Recovery—Find out what will happen to data in the case of a disaster. Do they offer
complete restoration? If so, how long would that take?
6. Investigative support—Does the vendor have the ability to investigate any inappropriate
or illegal activity?
7. Long-term viability—What will happen to data if the company goes out of business? How
will data be returned, and in what format?
The security practices for the SaaS environment are as follows:
Security Management (People)
One of the most important actions for a security team is to develop a formal charter
for the security organization and program.
This will foster a shared vision among the team of what security leadership is driving
toward and expects, and will also foster "ownership" in the success of the collective
team.
The charter should be aligned with the strategic plan of the organization or company
the security team works for.
4.2.3 Security Governance
A security steering committee should be developed whose objective is to focus on providing
guidance about security initiatives and their alignment with business and IT strategies.
A charter for the security team is typically one of the first deliverables from the
steering committee.
This charter must clearly define the roles and responsibilities of the security team
and other groups involved in performing information security functions.
Lack of a formalized strategy can lead to an unsustainable operating model and
security level as it evolves.
In addition, lack of attention to security governance can result in key needs of the
business not being met, including but not limited to, risk management, security
monitoring, application security, and sales support.
Lack of proper governance and management of duties can also result in potential
security risks being left unaddressed and opportunities to improve the business being
missed.
Without such governance, the security team may end up not focusing on the key security
functions and activities that are critical to the business.
Cloud security governance refers to the management model that facilitates effective and
efficient security management and operations in the cloud environment so that an enterprise’s
business targets are achieved. This model incorporates a hierarchy of executive mandates,
performance expectations, operational practices, structures, and metrics that, when implemented,
result in the optimization of business value for an enterprise and help answer key leadership
questions about security.
Risk Management
Effective risk management entails identification of technology assets; identification of
data and its links to business processes, applications, and data stores; and assignment of
ownership and custodial responsibilities.
Actions should also include maintaining a repository of information assets.
A risk assessment process should be created that allocates security resources related to
business continuity.
Risk Assessment
Security risk assessment is critical to helping the information security organization make
informed decisions when balancing the dueling priorities of business utility and
protection of assets.
Vulnerability Assessment
Vulnerability assessment classifies network assets to more efficiently prioritize
vulnerability-mitigation programs, such as patching and system upgrading.
It measures the effectiveness of risk mitigation by setting goals of reduced
vulnerability exposure and faster mitigation.
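As a small sketch (Python) of this idea, the ranking below prioritizes findings by asset
criticality and severity and tracks a simple exposure total across scans; the fields and
weighting are assumptions, not a standard scoring scheme.

from dataclasses import dataclass


@dataclass
class Finding:
    asset: str
    asset_criticality: int  # 1 (low) .. 5 (business critical)
    severity: float         # e.g. CVSS base score 0.0 .. 10.0


def prioritize(findings):
    # Patch the most severe issues on the most critical assets first.
    return sorted(findings, key=lambda f: f.asset_criticality * f.severity,
                  reverse=True)


def exposure(findings):
    # A simple exposure metric; comparing it across scans shows whether
    # mitigation (patching, system upgrades) is actually reducing risk.
    return sum(f.asset_criticality * f.severity for f in findings)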
Security Images:
Virtualization-based cloud computing provides the ability to create "Gold image" VM
secure builds and to clone multiple copies.
Gold image VMs also provide the ability to keep security up to date and
reduce exposure by patching offline.
Data Privacy
Depending on the size of the organization and the scale of operations, either an individual
or a team should be assigned and given responsibility for maintaining privacy.
A member of the security team who is responsible for privacy or security compliance
should collaborate with the company’s legal team to address data privacy issues
and concerns.
Hiring a consultant in the privacy area will help ensure that your organization is prepared to
meet the data privacy demands of its customers and regulators.
Data Governance
The data governance framework should include:
Data inventory
Data classification
Data analysis (business intelligence)
Data protection
Data privacy
Data retention/recovery/discovery
Data destruction
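A possible, purely illustrative way to capture these framework items as a machine-readable
inventory record is sketched below (Python); the field names and classification values are
assumptions, not a standard.

from dataclasses import dataclass
from enum import Enum


class Classification(Enum):
    PUBLIC = "public"
    INTERNAL = "internal"
    CONFIDENTIAL = "confidential"
    RESTRICTED = "restricted"


@dataclass
class DataAsset:
    name: str
    owner: str
    classification: Classification
    retention_years: int          # drives retention/recovery/discovery
    destruction_method: str       # e.g. "crypto-erase", "secure wipe"
    contains_personal_data: bool  # flags the asset for privacy review


inventory = [
    DataAsset("customer_orders", "sales-ops", Classification.CONFIDENTIAL,
              retention_years=7, destruction_method="crypto-erase",
              contains_personal_data=True),
]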
Data Security
The challenge in cloud computing is data-level security.
Security for data is provided by:
Encrypting the data
Permitting only specified users to access the data
Restricting the data from crossing country borders
For example, with data-level security, the enterprise can specify that particular data is not
allowed to go outside of India.
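A minimal sketch of the two controls above, assuming the third-party Python "cryptography"
package is installed; the region name and the residency policy are illustrative examples, not a
real regulatory rule engine.

from cryptography.fernet import Fernet

ALLOWED_REGIONS = {"ap-south-1"}  # example: keep the data inside India


def store(record, key, target_region):
    # Data-level policy: refuse to place the data outside the allowed regions.
    if target_region not in ALLOWED_REGIONS:
        raise PermissionError(f"data may not be stored in {target_region}")
    # Encrypt the record before it is written to the vendor's storage pool.
    return Fernet(key).encrypt(record)


key = Fernet.generate_key()
ciphertext = store(b"account balance: 1200", key, "ap-south-1")
plaintext = Fernet(key).decrypt(ciphertext)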
Application Security
This is a collaborative effort between the security and product development teams to define
and follow application security processes.
Cloud Identity Administration: Cloud identity administrative functions should focus on life
cycle management of user identities in the cloud—provisioning, deprovisioning, identity
federation, SSO, password or credentials management, profile management, and administrative
management. Organizations that are not capable of supporting federation should explore cloud-
based identity management services. This new breed of service usually synchronizes an
organization’s internal directories with its own (usually multitenant) directory and acts as a
proxy IdP for the organization.
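Sketch only (Python): the directory client below is a stand-in invented for illustration, used to
make the life-cycle operations listed above explicit (provisioning, deprovisioning, credential
and profile management).

class InMemoryDirectory:
    """Stand-in for a cloud directory service (assumed, for illustration)."""
    def __init__(self):
        self.users = {}

    def create_user(self, user_id, profile):
        self.users[user_id] = {"profile": profile, "enabled": True}

    def update_user(self, user_id, changes):
        self.users[user_id]["profile"].update(changes)

    def issue_temporary_password(self, user_id):
        return f"temp-{user_id}"   # placeholder, not a real credential flow

    def disable_user(self, user_id):
        self.users[user_id]["enabled"] = False


class IdentityAdmin:
    """Wraps the identity life-cycle operations named in the text."""
    def __init__(self, directory):
        self.directory = directory

    def provision(self, user_id, profile):
        self.directory.create_user(user_id, profile)

    def update_profile(self, user_id, changes):
        self.directory.update_user(user_id, changes)

    def reset_credentials(self, user_id):
        return self.directory.issue_temporary_password(user_id)

    def deprovision(self, user_id):
        # Removing access promptly is the key control when a user leaves.
        self.directory.disable_user(user_id)


admin = IdentityAdmin(InMemoryDirectory())
admin.provision("alice", {"role": "analyst"})
admin.deprovision("alice")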
Federated Identity (SSO): Organizations planning to implement identity federation that enables
SSO for users can take one of the following two paths (architectures):
• Implement an enterprise IdP within an organization perimeter.
• Integrate with a trusted cloud-based identity management service provider.
Both architectures have pros and cons.
Enterprise identity provider: In this architecture, cloud services will delegate authentication to
an organization’s IdP. In this delegated authentication architecture, the organization federates
identities within a trusted circle of CSP domains. A circle of trust can be created with all the
domains that are authorized to delegate authentication to the IdP. In this deployment architecture,
where the organization will provide and support an IdP, greater control can be exercised over
user identities, attributes, credentials, and policies for authenticating and authorizing users to a
cloud service.
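The delegation decision can be pictured with a short sketch (Python); the domain, URL, and
function are made-up examples of a circle-of-trust check, not a real SAML or OpenID Connect
flow.

TRUSTED_IDPS = {
    "example.org": "https://idp.example.org/sso",   # enterprise IdP (example)
}


def authentication_redirect(user_domain):
    idp = TRUSTED_IDPS.get(user_domain)
    if idp is None:
        raise PermissionError(f"{user_domain} is not in the circle of trust")
    # The cloud service delegates authentication; the IdP authenticates the
    # user and returns a signed assertion back to the cloud service.
    return idp


print(authentication_redirect("example.org"))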
4.5.4 SSL/TLS
Transport Layer Security (TLS) and its predecessor, Secure Sockets Layer (SSL), are
cryptographically secure protocols designed to provide security and data integrity for
communications over TCP/IP. TLS and SSL encrypt the segments of network connections at the
transport layer. Several versions of the protocols are in general use in web browsers, email,
instant messaging, and voice-over-IP. TLS is an IETF standards-track protocol; TLS version 1.2
is specified in RFC 5246.
The TLS protocol allows client/server applications to communicate across a network in a
way specifically designed to prevent eavesdropping, tampering, and message forgery. TLS
provides endpoint authentication and data confidentiality by using cryptography. TLS
authentication is typically one-way: only the server is authenticated (its identity is ensured),
while the client remains unauthenticated. At the browser level, this means that the browser has
validated the server's certificate; more specifically, it has checked the digital signatures of
the server certificate's issuing chain of Certification Authorities (CAs).
Validation does not identify the server to the end user. For true identification, the end
user must verify the identification information contained in the server's certificate (and, indeed,
its whole issuing CA chain). This is the only way for the end user to know the "identity" of the
server, and the only way identity can be securely established, by verifying that the URL,
name, or address being used is specified in the server's certificate. Malicious web sites
cannot use the valid certificate of another web site because they do not possess the matching
private key and therefore cannot complete the handshake or decrypt traffic protected by that
certificate.
Since only a trusted CA can embed a URL in the certificate, this ensures that checking the
apparent URL with the URL specified in the certificate is an acceptable way of identifying the
site. TLS also supports a more secure bilateral connection mode whereby both ends of the
connection can be assured that they are communicating with whom they believe they are
connected. This is known as mutual (assured) authentication. Mutual authentication requires the
TLS client-side to also maintain a certificate.
TLS involves three basic phases:
1. Peer negotiation for algorithm support
2. Key exchange and authentication
3. Symmetric cipher encryption and message authentication
During the first phase, the client and server negotiate a cipher suite, which determines the
symmetric cipher to be used, the key exchange and authentication algorithms, and the message
authentication codes. The key exchange and authentication algorithms are typically public key
algorithms. The message authentication codes are built from cryptographic hash functions. Once
these decisions are made, data transfer may begin.
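For illustration, the client side of these phases can be exercised with Python's standard ssl
module; the host name below is only an example.

import socket
import ssl

# Default context: trusted CA store, certificate and host-name checking enabled.
context = ssl.create_default_context()

with socket.create_connection(("www.example.com", 443)) as sock:
    with context.wrap_socket(sock, server_hostname="www.example.com") as tls:
        # Phases 1 and 2 are complete here: cipher suite negotiated, server
        # authenticated against its certificate chain, keys exchanged.
        print("negotiated:", tls.version(), tls.cipher())
        print("server certificate subject:", tls.getpeercert()["subject"])
        # Phase 3: application data flows over the symmetric cipher.
        tls.sendall(b"HEAD / HTTP/1.1\r\nHost: www.example.com\r\n\r\n")
        print(tls.recv(256))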