
Unit V Cloud Computing Notes

The document discusses cloud security, emphasizing the importance of policies and technologies to protect cloud systems, data, and infrastructure. It outlines various challenges such as multi-tenancy, data mobility, and privacy issues, while also highlighting the benefits of centralized security and reduced costs. Additionally, it covers Software as a Service (SaaS) security practices and introduces the Open Cloud Consortium and Distributed Management Task Force as organizations focused on cloud standards and interoperability.

Uploaded by

Manu Tyagi

lOMoAR cPSD| 45391676

UNIT – V
Security, Standards and Applications

Lecture 1:
Security in Clouds
➢ Cloud Security, also known as cloud computing security, consists of a set of policies, controls,
procedures and technologies that work together to protect cloud-based systems, data, and
infrastructure.
➢ These security measures are configured to protect cloud data, support regulatory compliance
and protect customers' privacy, as well as to set authentication rules for individual users and
devices.
➢ From authenticating access to filtering traffic, cloud security can be configured to the exact
needs of the business. Because these rules can be configured and managed in one place,
administration overheads are reduced and IT teams are freed to focus on other areas of the
business.
➢ The way cloud security is delivered will depend on the individual cloud provider or the cloud
security solutions in place. However, implementation of cloud security processes should be a
joint responsibility between the business owner and solution provider.
➢ For businesses making the transition to the cloud, robust cloud security is imperative. Security
threats are constantly evolving and becoming more sophisticated, and cloud computing is no
less at risk than an on-premises environment. For this reason, it is essential to work with a cloud
provider that offers best-in-class security that has been customized for your infrastructure.

Cloud Computing Security Challenges


In traditional data centers, IT managers put procedures and controls in place to build a
hardened perimeter around the infrastructure and data they want to secure. This configuration is
relatively easy to manage, since organizations have control of their servers' location and utilize the
physical hardware entirely for themselves. In the private and public cloud, however, perimeter
boundaries blur and control over security diminishes as applications move dynamically and
organizations share the same remotely located physical hardware with strangers.
➢ Multi-tenancy
Cloud computing users share physical resources with others through common software
virtualization layers. These shared environments introduce unique risks into a user’s resource
stack. For example, the cloud consumer is completely unaware of a neighbor's identity, security
profile or intentions. The virtual machine running next to the consumer's environment could
be malicious, looking to attack the other hypervisor tenants or sniff communications moving
throughout the system. Because the cloud consumer’s data sits on common storage hardware,
it could become compromised through lax access management or malicious attack. In a joint
paper published in November 2009 by MIT and UCSD entitled “Hey, You, Get Off of My
Cloud: Exploring Information Leakage in Third-Party Compute Clouds,” the authors exhibited
the possibility of a side-channel attack in a cloud environment in which an attacker would be
able to implant some arbitrary code into a neighbor’s VM environment with little to no
chance of detection. In another scenario, a security bulletin from Amazon Web Services
reported that the Zeus Botnet was able to install and successfully run a command and control
infrastructure in the cloud environment.
➢ Data Mobility and Control
Moving data from static physical servers onto virtual volumes makes it remarkably mobile,
and data stored in the cloud can live anywhere in the virtual world. Storage administrators
can easily reassign or replicate users’ information across data centers to facilitate server
maintenance, HA/DR or capacity planning, with little or no service interruption or notice to data
owners. This creates a number of legal complications for cloud users. EU data-protection
legislation, for example, restricts the processing or storage of residents' data within foreign data
centers. Careful controls must be applied to data in cloud computing environments to ensure
cloud providers do not inadvertently break these rules by migrating geographically sensitive
information across political boundaries. Further, legislation such as the US Patriot Act allows
federal agencies to present vendors with subpoenas and seize data (which can include trade
secrets and sensitive electronic conversations) without informing or gaining data owners’
consent.
➢ Data Remanence
Although the recycling of storage resources is common practice in the cloud, no clear
standard exists on how cloud service providers should recycle memory or disk space. In
many cases, vacated hardware is simply re-purposed with little regard to secure hardware
repurposing. The risk of a cloud tenant being able to gather pieces of the previous tenants’
data is high when resources are not securely recycled. Resolving the issue of data remanence
can frequently consume considerable negotiating time while establishing service agreements
between an enterprise and a cloud service provider.
➢ Data Privacy
The public nature of cloud computing poses significant implications to data privacy and
confidentiality. Cloud data is often stored in plain text, and few companies have an absolute
understanding of the sensitivity levels their data stores hold. Data breaches are embarrassing
and costly. In fact, a recent report by the Cloud Security Alliance lists data loss and leakage
as one of the top security concerns in the cloud. Recent laws, regulations and compliance
frameworks compound the risks; offending companies can be held responsible for the loss of
sensitive data and may face heavy fines over data breaches. Business impacts aside, loose
data security practices also harm on a personal level. Lost or stolen medical records, credit
card numbers or bank information may cause emotional and financial ruin, the repercussions
of which could take years to repair. Sensitive data stored within cloud environments must be
safeguarded to protect its owners and subjects alike.

Parameters for cloud security


There are numerous security issues for cloud computing, as it encompasses many technologies
including networks, databases, operating systems, virtualization, resource scheduling, transaction
management, load balancing, concurrency control and memory management. Security issues for
many of these systems and technologies are applicable to cloud computing. For example, the
network that interconnects the systems in a cloud has to be secure. Furthermore, the virtualization
paradigm in cloud computing results in several security concerns. For example, mapping the virtual
machines to the physical machines has to be carried out securely. Data security involves encrypting
the data as well as ensuring that appropriate policies are enforced for data sharing. In addition,
resource allocation and memory management algorithms have to be secure. Finally, data mining
techniques may be applicable to malware detection in clouds.

Benefits of Cloud Security


1. Centralized security: Just as cloud computing centralizes applications and data, cloud
security centralizes protection. Cloud-based business networks consist of numerous devices
and endpoints that can be difficult to manage when dealing with shadow IT or BYOD.
Managing these entities centrally enhances traffic analysis and web filtering, streamlines the
monitoring of network events and results in fewer software and policy updates. Disaster
recovery plans can also be implemented and actioned easily when they are managed in one
place.
2. Reduced costs: One of the benefits of utilizing cloud storage and security is that it eliminates
the need to invest in dedicated hardware. Not only does this reduce capital expenditure, but it
also reduces administrative overheads. Where once IT teams were firefighting security issues
reactively, cloud security delivers proactive security features that offer protection 24/7 with
little or no human intervention.
3. Reduced administration: When you choose a reputable cloud services provider or cloud
security platform, you can say goodbye to manual security configurations and near-constant
security updates. These tasks can be a massive drain on resources, but when you move them
to the cloud, all security administration happens in one place and is fully managed on your
behalf.
4. Reliability: Cloud computing services offer the ultimate in dependability. With the right cloud
security measures in place, users can safely access data and applications within the cloud no
matter where they are or what device they are using.

Lecture 2:
Software as a Service Security
➢ SaaS security is cloud-based security designed to protect the data that software-as-a-service
applications carry.
➢ It's a set of practices that companies that store data in the cloud put in place to protect sensitive
information pertaining to their customers and the business itself.
➢ However, SaaS security is not the sole responsibility of the organization using the cloud
service. In fact, the service customer and the service provider share the obligation to adhere to
SaaS security guidelines such as those published by the UK's National Cyber Security Centre
(NCSC).
➢ SaaS security is also an important part of SaaS management, which aims to reduce unused
licenses and shadow IT and to decrease security risks by creating as much visibility as possible.

Six SaaS Security best practices


One of the main benefits that SaaS has to offer is that the respective applications are on-demand,
scalable, and very fast to implement, saving companies valuable resources and time. On top of
that, the SaaS provider typically handles updates and takes care of software maintenance.
This flexibility and the fairly open access have created new security risks that SaaS security best
practices try to address and mitigate. Below are six security practices and solutions that
every cloud-operating business should know about.
1. Enhanced Authentication
Offering a cloud-based service to your customers means that there has to be a way for them to
access the software. Usually, this access is regulated through login credentials. That's why
knowing how your users access the resource and how the third-party software provider handles
the authentication process is a great starting point. Once you understand the various methods, you
can make better SaaS security decisions and enable additional security features like multifactor
authentication or integrate other enhanced authentication methods.
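As an illustration of one such enhanced method, the sketch below generates a time-based one-time password (TOTP, the mechanism behind most authenticator apps) following RFC 6238, using only the Python standard library; the function name and shared secret are illustrative:

```python
import hashlib
import hmac
import struct
import time

def totp(secret, for_time=None, step=30, digits=6):
    """Generate an RFC 6238 time-based one-time password."""
    now = time.time() if for_time is None else for_time
    counter = int(now // step)                       # 30-second time window
    msg = struct.pack(">Q", counter)                 # 8-byte big-endian counter
    digest = hmac.new(secret, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                       # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# RFC 6238 test vector: ASCII secret "12345678901234567890" at t = 59 s
print(totp(b"12345678901234567890", for_time=59))    # → 287082
```

A server verifying the code computes the same value from its copy of the secret, usually also accepting the adjacent time window to tolerate clock drift.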
2. Data Encryption
The majority of channels that SaaS applications use to communicate employ TLS (Transport
Layer Security) to protect data that is in transit. However, data that is at rest can be just as
vulnerable to cyber attacks as data that is being exchanged. That’s why more and more SaaS
providers offer encryption capabilities that protect data in transit and at rest. It’s a good idea to
talk to your provider and check whether enhanced data encryption is available for all the SaaS
services you use.
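As a sketch of what client-side encryption at rest can look like, the example below encrypts a record before it would be uploaded, assuming the third-party Python `cryptography` package is available; the key handling and record contents are illustrative only:

```python
from cryptography.fernet import Fernet

# Generate (and securely store) a key once; never upload it alongside the data.
key = Fernet.generate_key()
f = Fernet(key)

record = b"patient-id=1042;diagnosis=flu"   # illustrative sensitive record
ciphertext = f.encrypt(record)              # what actually gets stored in the cloud
assert f.decrypt(ciphertext) == record      # only the key holder can recover it
```

With this pattern the SaaS provider only ever sees ciphertext, so a breach of the stored data does not expose the plaintext as long as the key stays with the customer.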

3. Vetting and Oversight


With a stark increase in SaaS deployment, usage and demand, new SaaS vendors emerge on a
regular basis. This creates a competitive market and gives companies seeking the best SaaS
solutions for their business needs the upper hand. However, too many similar products can lead
to decision fatigue or rash decisions. When you choose your SaaS provider, apply the same review
and validation process you would with other vendors, and compare optional security features that
might be available.
4. Discovery and Inventory
With increased digital literacy, software procurement is not only limited to IT departments but
can be practiced by almost every employee. Ultimately, this leads to shadow IT and security
loopholes. That’s why one of the most important SaaS security practices involves maintaining a
reliable inventory of what services are being used and the tracking of SaaS usage to detect unusual
or unexpected activity. Automated tools within SaaS management systems can send out alerts for
immediate notification.
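A minimal sketch of such usage tracking, assuming a hypothetical approved-SaaS inventory and a pre-extracted list of outbound destinations (a real deployment would pull these from proxy or CASB logs):

```python
# Hypothetical approved-SaaS inventory for the organization.
APPROVED = {"salesforce.com", "slack.com", "office365.com"}

def shadow_it(access_log):
    """Return SaaS domains seen in outbound traffic but missing from
    the approved inventory, with request counts for alerting."""
    counts = {}
    for domain in access_log:
        if domain not in APPROVED:
            counts[domain] = counts.get(domain, 0) + 1
    return counts

log = ["salesforce.com", "dropbox.com", "slack.com", "dropbox.com"]
print(shadow_it(log))    # → {'dropbox.com': 2}
```

Anything this returns is a candidate for review: either the service gets vetted and added to the inventory, or the usage gets blocked.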
5. Consider CASBs
It is possible that the SaaS provider that you are choosing is not able to provide the level of SaaS
security that your company requires. If there are no viable alternatives when it comes to the
vendor, consider cloud access security broker (CASB) tool options. This allows your company to
add a layer of additional security controls that are not native to your SaaS application. When
selecting a CASB – whether proxy or API-based – make sure it fits into your existing IT
architecture.
6. Maintain situational awareness
Last but not least, always monitor your SaaS use. Comprehensive SaaS management tools and
CASBs offer you a lot of information that can help you make the right decision when it comes
to SaaS security.

Common Cloud Security Standards

Cloud Security encompasses the technologies, controls, processes, and policies which combine to
protect your cloud-based systems, data, and infrastructure. It is a sub-domain of computer security
and more broadly, information security.
The most well-known standard in information security and compliance is ISO 27001, developed
by the International Organization for Standardization.
The ISO 27001 standard was created to assist enterprises in protecting sensitive data through best
practices.
Cloud compliance is the principle that cloud-delivered systems must be compliant with the
standards their customers require. Cloud compliance ensures that cloud computing services meet
compliance requirements.
Lecture 3:
Open Cloud Consortium
The Open Cloud Consortium (OCC) is a member-driven organization that:
▪ supports the development of standards for cloud computing and frameworks for
interoperating between clouds;
▪ develops benchmarks for cloud computing;
▪ supports reference implementations for cloud computing, preferably open-source reference
implementations;
▪ manages a testbed for cloud computing called the Open Cloud Testbed (OCT); and
▪ sponsors workshops and other events related to cloud computing.
The OCC has a particular focus on large data clouds. It has developed the MalStone Benchmark
for large data clouds and is working on a reference model for large data clouds.
The Open Cloud Consortium is organized into different working groups.
Some of the research projects of the OCC are:


Cloud Standards Coordination Overview and Contributing Organizations
Cloud-standards.org is a wiki site for Cloud Standards Coordination. The goal of the wiki is to
document the activities of the various SDOs working on cloud standards. Cloud-standards.org is
an initiative for editing and sharing a general cloud computing standardization positioning, in
which the more relevant cloud standardization initiatives can be seen and related. The first informal
proposal of the positioning can be seen at cloud standards positioning.

Open Cloud Testbed (OCT)


This working group manages and operates the Open Cloud Testbed. The Open Cloud Testbed uses
the Cisco C-Wave and UIC Teraflow Network for its network connections. Both use
wavelengths provided by the National Lambda Rail. Currently, membership in this working group
is limited to OCC members who can contribute computing, networking, or other resources to the
Open Cloud Testbed.

Project Matsu
Project Matsu is a collaboration with NASA. The goal is to create a cloud containing all Earth
Observing-1 (EO-1) satellite imagery from the Advanced Land Imager (ALI) and Hyperion
instruments and to make this data available to interested users. This working group is also
developing cloud-based services that can assist at times of disaster. The Project Matsu cloud can,
for example, be used to assist with image processing so that up-to-date images can be made
available to those providing disaster assistance.

The Open Science Data Cloud (OSDC) Working Group


The Open Science Data Cloud (OSDC) is a cloud-based infrastructure that allows scientists to
manage, analyze, integrate and share medium-to-large scientific datasets. The Institute for
Genomic and Systems Biology at the University of Chicago uses the OSDC as the basis for
Bionimbus, a cloud for genomics and related data. Johns Hopkins University uses the OSDC to
provide bulk downloads of the Sloan Digital Sky Survey to astronomers around the world. NASA
uses the OSDC to make data from the EO-1 satellite available to interested parties. Partial funding
for the OSDC is provided by the Gordon and Betty Moore Foundation and the National Science
Foundation. OSDC partners include Yahoo, which contributed equipment to the OSDC, and Cisco,
which provides access to the Cisco C-Wave.

Distributed Management Task Force


Distributed Management Task Force (DMTF) standards enable effective management of IT
environments. The organization is composed of companies that collaborate on the development,
validation and promotion of infrastructure management standards. DMTF management standards
are critical to enabling interoperability among multi-vendor systems, tools and solutions within the
enterprise.
The DMTF is an industry standards organization working to simplify the manageability of
network-accessible technologies through open and collaborative efforts by leading technology
companies. DMTF creates and drives the international adoption of interoperable management
standards, supporting implementations that enable the management of diverse traditional and
emerging technologies including cloud, virtualization, network and infrastructure.
DMTF spans the globe with member companies and organizations representing varied industry
sectors. The DMTF board of directors is led by 14 industry-leading technology companies
including: Broadcom Limited, CA Technologies, Dell, Emerson Network Power, Hewlett
Packard Enterprise, Hitachi, Ltd., HP Inc., Intel Corporation, Lenovo, Microsoft Corporation,
NetApp, Software AG, TIM and VMware.

DMTF standards include:


• Cloud Infrastructure Management Interface (CIMI) – a self-service interface for infrastructure
clouds, allowing users to dynamically provision, configure and administer their cloud usage
with a high-level interface that greatly simplifies cloud systems management. The
specification standardizes interactions between cloud environments to achieve interoperable
cloud infrastructure management between service providers and their consumers and
developers.
• Common Information Model (CIM) – the CIM schema is a conceptual schema that defines
how the managed elements in an IT environment (for instance, computers
or storage area networks) are represented as a common set of objects and relationships
between them. CIM is extensible in order to allow product-specific extensions to the common
definition of these managed elements. CIM uses a model based upon UML to define the CIM
Schema. CIM is the basis for most of the other DMTF standards.
• Common Diagnostic Model (CDM) – the CDM schema is a part of the CIM schema that
defines how system diagnostics should be incorporated into the management infrastructure.
• Web-Based Enterprise Management (WBEM) – defines protocols for the interaction between
systems management infrastructure components implementing CIM. It includes the concept
of DMTF management profiles, which describe the behavior of the elements defined in the
CIM schema, the CIM Query Language (CQL), and other specifications needed for the
interoperability of CIM infrastructure.
• Systems Management Architecture for Server Hardware (SMASH) – a DMTF management
initiative that includes management profiles for server hardware management. SMASH 2.0
allows for either WS-Management or SM-CLP (a command-line protocol for interacting with
CIM infrastructure). SM-CLP was adopted as an international standard in August 2011 by
the Joint Technical Committee 1 (JTC 1) of the International Organization for
Standardization (ISO) and the International Electrotechnical Commission (IEC).
• System Management BIOS (SMBIOS) – defines how the BIOS interface of x86-architecture
systems is represented in CIM (and DMI).
• Alert Standard Format (ASF) – defines remote control and alerting interfaces for OS-absent
environments (for instance, a system board controller of a PC).
• Desktop Management Interface (DMI) – the first desktop management standard. Due to the
rapid advancement of DMTF technologies, such as CIM, the DMTF defined an "end of life"
process for DMI, which ended March 31, 2005.
• Redfish – DMTF's Redfish API is an open industry standard specification and schema
designed to meet the expectations of end users for simple, modern and secure management
of scalable platform hardware. Created by the Scalable Platforms Management Forum
(SPMF), Redfish specifies a RESTful interface and utilizes JSON and OData to help
customers integrate solutions within their existing tool chains.
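As a sketch of the RESTful style Redfish specifies, the example below walks the members of a systems collection; the JSON body is a simplified sample of the shape the specification defines, and the host in the trailing comment is hypothetical:

```python
import json

# A simplified body of the kind GET /redfish/v1/Systems returns
# (resource paths follow the Redfish specification; values are illustrative):
sample = """{
  "@odata.id": "/redfish/v1/Systems",
  "Members@odata.count": 1,
  "Members": [ { "@odata.id": "/redfish/v1/Systems/1" } ]
}"""

collection = json.loads(sample)
# Each member is itself a resource reachable by a further GET.
member_urls = [m["@odata.id"] for m in collection["Members"]]
print(member_urls)   # → ['/redfish/v1/Systems/1']

# Against a live Redfish service the same walk would start with:
#   import urllib.request
#   with urllib.request.urlopen("https://bmc.example.com/redfish/v1/Systems") as r:
#       collection = json.load(r)
```

The point of the hypermedia style is that a client never hard-codes server layout: it starts at `/redfish/v1` and follows `@odata.id` links.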
• Web Services Management (WS-Man) – the DMTF's Web Services Management (WS-Man)
provides interoperability between management applications and managed resources,
and identifies a core set of web service specifications and usage requirements that expose a
common set of operations central to all systems management. A SOAP-based protocol for
managing computer systems (e.g., personal computers, workstations, servers, smart devices),
WS-Man supports web services and helps constellations of computer systems and network-
based services collaborate seamlessly.
• Desktop and mobile Architecture for System Hardware (DASH) – a management standard
based on DMTF Web Services for Management (WS-Management), for desktop and mobile
client systems. WS-Management was adopted as an international standard by ISO/IEC in
2013.
• Configuration Management Database Federation (CMDBf) – facilitates the sharing of
information between configuration management databases (CMDBs) and other management
data repositories (MDRs). The CMDBf standard enables organizations to federate and access
information from complex, multi-vendor infrastructures, simplifying the process of
managing related configuration data stored in multiple CMDBs and MDRs.
• Cloud Auditing Data Federation (CADF) – the CADF standard defines a full event model
anyone can use to fill in the essential data needed to certify, self-manage and self-audit
application security in cloud environments. CADF is an open standard that enables
cross-vendor information sharing via its data format and interface definitions.
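A sketch of what a CADF-style activity event can look like; the field names follow the CADF event model (who did what to which resource, observed by whom), while the identifiers and typeURI values are illustrative:

```python
import json

# Illustrative CADF activity event: "initiator read target, seen by observer".
event = {
    "typeURI": "http://schemas.dmtf.org/cloud/audit/1.0/event",
    "eventType": "activity",
    "id": "evt-0001",                 # hypothetical event identifier
    "action": "read",
    "outcome": "success",
    "initiator": {"id": "user-42", "typeURI": "data/security/account/user"},
    "target": {"id": "storage-7", "typeURI": "storage/object"},
    "observer": {"id": "auditsvc-1", "typeURI": "service/security/audit"},
}
print(json.dumps(event, indent=2))
```

Because every provider emits the same shape, audit tooling can federate and query events across clouds without per-vendor parsers.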
• Platform Management Components Intercommunication (PMCI) – a suite of specifications
defining a common architecture for intercommunication among management subsystem
components. This suite includes the MCTP, PLDM and NC-SI specifications. The Platform
Management standard was adopted as a national standard by ANSI in 2013.
• Virtualization Management Initiative (VMAN) – a suite of specifications based on DMTF's
CIM that helps IT managers deploy, discover/inventory and manage the lifecycle of virtual
computer systems; create, modify and delete virtual resources; and monitor virtual systems
for health and performance. VMAN was adopted as a national standard by the American
National Standards Institute (ANSI) International Committee for Information Technology
Standards (INCITS) in June 2012.
• Network Management Initiative (NETMAN) – addresses today's complex data center
and network environments. The initiative will lead the industry toward the unification of
network management across traditional, cloud and software-defined data center (SDDC)
environments with the development of integrated standards to address physical, virtual,
application-centric and software-defined networks. While cloud, virtualization and software-
defined networking have eased the use of network functions for consumers, the challenges
of deploying and managing the networks supporting these infrastructures have magnified.
Addressing the current complexity and abstraction, DMTF's NETMAN will provide the
necessary standards-based management models and interfaces to enable consistent, unified and
automated provisioning, deployment, configuration and monitoring of network environments.

Lecture 4:
Standards for Application Developer
The reasons to adopt standards in cloud computing closely match the logic that made
the universal usability of the Internet a reality: the more accessible data is, the more interoperable
software and platforms are, and the more standardized the operating protocols are, the easier the
cloud will be to use and the more people will use it -- and the cheaper it will be to implement,
operate, and maintain. Systems and software designers see this logic in action when they create a
cloud platform and don't have to worry about figuring out how to make it work with a dozen or so
network protocols. Cloud application developers feel the power of standards when they build an
application using a framework that guarantees almost 100 percent success in such areas as data
access, resource allocation, debugging, failover mechanisms, user interface reconfiguration, and
error, data, and exception handling -- not to mention the shouts of joy when a developer realizes
that a favored toolkit can integrate into a favored development platform, sometimes with only the
push of a button.
Below are cloud standards (current as of 2015) that designers and developers can use to help
make software design simpler, cheaper, and faster.
Cloud Standards Customer Council (CSCC)


CSCC is an end-user advocacy group that seeks to "accelerate cloud's successful adoption" as a
means to strengthen 21st century enterprises. It is not really a standards organization but a
facilitator; it works with existing standards groups to ensure that client requirements are
addressed as standards evolve. This group understands that the transition from a traditional IT
environment to a cloud-based environment can require significant changes, so it attempts to
guarantee that this transition won't cost end-users the choice and flexibility they enjoy with their
current IT environments. Another role of the CSCC is to advocate for the establishment of open,
transparent standards for cloud computing; the council believes that the agility and economic
efficiencies cloud offers are only possible if the performance, security, and interoperability
issues that arise during the transition to the cloud are answered in an open, transparent way.

Distributed Management Task Force (DMTF)

DMTF is an association of industry IT companies and professionals collaborating on and
promoting enterprise systems management and interoperability standards, with a goal of
providing "common management infrastructure components for instrumentation, control, and
communication in a platform-independent and technology-neutral way." The DMTF has
several areas of focus.

Open Virtualization Format (OVF)

The OVF standard, adopted as ISO 17203 by the International Organization for Standardization
(ISO), creates uniform formatting for virtual systems-based software. OVF is platform
independent, flexible, and open, and can be used by anyone who needs a standardized package
for creating a virtual software solution that requires interoperability and portability. OVF
simplifies management standards using the Common Information Model (CIM) to standardize
management information formats; this reduces design and development overhead by allowing
for quicker and more cost-effective implementation of new software solutions.

Open Cloud Standards Incubator working group

The Open Cloud Standards Incubator working group's goal is to facilitate management
interoperability between in-enterprise private clouds and public and hybrid clouds. The
components (cloud resource management protocols, packaging formats, and security
mechanisms) address the increasing need for open, consistent cloud management architecture
standards.

Cloud Management Working Group (CMWG)


CMWG uses the Cloud Infrastructure Management Interface (CIMI) to visually represent the
total lifecycle of a cloud service so that you can enhance the implementation and management
of that service and make sure it is meeting service requirements. This group can explain how
to model the characteristics of an operation, allowing variation of your implementation to be
tested prior to final development; it does this with CIM, which creates data classes with well-
defined associations and characteristics, as well as a conceptual framework for organizing these
components. CIM uses discrete layers: core model, common model, and extension
representations.

Cloud Auditing Data Federation Working Group (CADF)

CADF works to standardize "audit events across all cloud and service providers" with the goal
of resolving significant issues in cloud computing due to inconsistencies or incompatibilities.
It seeks to ensure consumers of cloud computing systems that the security policies required on
their applications are properly managed and enforced.

European Telecommunications Standards Institute (ETSI)

ETSI is an organization that produces internationally applicable standards in information and
communications technology to improve systems interoperability, efficiencies, and economies
through shared knowledge and expertise.

ETSI Technical Committee Cloud

ETSI Technical Committee Cloud examines issues arising from the convergence of IT and
telecommunications. With cloud computing requiring connectivity to extend beyond the local
network, cloud network scalability has become dependent on the ability of the telecom industry
to handle rapid increases in data transfer; it also works on issues related to interoperability and
security.

Cloud Standards Coordination (CSC)

The CSC initiative is responsible for developing a detailed set of standards required to support
European Commission policy objectives that address security, interoperability, data portability,
and reversibility.

Global Inter-Cloud Technology Forum (GICTF)

GICTF is an organization promoting the standardization of network protocols and interfaces in


an effort to create a more reliable cloud services network that solves the problems of security,
data quality, system responsiveness, and reliability.

International Organization for Standardization (ISO)/International Electrotechnical


Commission (IEC)

ISO is a well-known, 70-year-old, independent, non-governmental membership organization
made up of 163 member countries. It is the world's largest developer of voluntary international
technology standards. The IEC is more than 100 years old and is the leading force behind
international standards for all technologies involving the electrical, electronic, and related fields.

Lecture 5:
Standards for Security

Standards for Messaging

❖ Simple Message Transfer Protocol (SMTP)

• SMTP is usually used for:


o Sending a message from a workstation to a mail server.
o Or communications between mail servers.

• Client must have a constant connection to the host to receive SMTP messages.
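As an illustration, the message a client hands to an SMTP server can be built with Python's standard email library. This is only a sketch: the addresses, host, and credentials below are placeholders, and the actual send (commented out) would need the live connection to the mail host noted above.

```python
from email.message import EmailMessage
import smtplib  # used only in the commented-out send step below

# Build a minimal message that an SMTP server could relay.
msg = EmailMessage()
msg["From"] = "alice@example.com"      # placeholder sender
msg["To"] = "bob@example.com"          # placeholder recipient
msg["Subject"] = "Hello via SMTP"
msg.set_content("SMTP carries this message from client to mail server.")

print(msg["Subject"])

# Sending requires a connection to the mail host:
# with smtplib.SMTP("mail.example.com", 587) as server:  # placeholder host
#     server.starttls()
#     server.login("alice", "password")                  # placeholder credentials
#     server.send_message(msg)
```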

❖ Post Office Protocol (POP)

• Purpose is to download messages from a server.

• This allows a server to store messages until a client connects and requests them.

• Once the client connects, the POP server delivers the messages, which are
subsequently deleted from the server
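The download-then-delete behavior can be sketched with a toy mailbox (this is a hypothetical simulation of POP semantics, not a real client; a real one would use Python's poplib against a live server):

```python
def pop_retrieve(server_mailbox):
    """Simulate POP: the client downloads all stored messages, after
    which they are deleted from the server and exist only locally."""
    downloaded = list(server_mailbox)   # client fetches every message
    server_mailbox.clear()              # server then deletes them
    return downloaded

# Messages accumulate on the server until a client connects.
mailbox = ["msg1", "msg2"]
local_copies = pop_retrieve(mailbox)
print(local_copies, mailbox)
```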

❖ Internet Messaging Access Protocol (IMAP)


• IMAP allows messages to be kept on the server.

• But viewed as though they were stored locally.

❖ Syndication (Atom & Atom Publishing Protocol, and RSS)


RSS
• The acronym “Really Simple Syndication” or “Rich Site Summary”.

• Used to publish frequently updated works—such as news headlines


• RSS is a family of web feed formats

Atom & Atom Publishing Protocol


• The Atom format was developed as an alternative to RSS
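For illustration, here is a minimal RSS 2.0 feed and one way its item titles might be read with Python's standard xml.etree parser; the feed content and URLs are made up.

```python
import xml.etree.ElementTree as ET

# A tiny, invented RSS 2.0 web feed with two items.
rss = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example News</title>
    <item><title>Headline one</title><link>http://example.com/1</link></item>
    <item><title>Headline two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

# Parse the feed and pull out each item's headline.
root = ET.fromstring(rss)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # → ['Headline one', 'Headline two']
```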

❖ Communications (HTTP, SIMPLE, and XMPP)


HTTP

• The acronym “Hypertext Transfer Protocol”.


▪ HTTP is a request/response standard between a client and a server
▪ For distributed, collaborative, hypermedia information systems.
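The request/response exchange can be sketched at the wire level; the host, path, and response body below are placeholders.

```python
# A raw HTTP/1.1 GET request, as the client would send it.
request = (
    "GET /index.html HTTP/1.1\r\n"
    "Host: example.com\r\n"        # placeholder host
    "Connection: close\r\n"
    "\r\n"                         # blank line ends the headers
)

# A canned server response, split into status line, headers, and body.
response = "HTTP/1.1 200 OK\r\nContent-Type: text/html\r\n\r\n<html>hi</html>"
head, _, body = response.partition("\r\n\r\n")
status_line = head.split("\r\n")[0]
print(status_line)  # → HTTP/1.1 200 OK
```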

XMPP (Extensible Messaging and Presence Protocol)


• Used for near-real-time, extensible instant messaging and presence information.

• XMPP remains the core protocol of the Jabber Instant Messaging and Presence
technology
SIMPLE
• Session Initiation Protocol for Instant Messaging and Presence Leveraging
Extensions
• For registering for presence information and receiving notifications.

• It is also used for sending short messages and managing a session of real-time
messages between two or more participants.

Lecture 6:
End User Access to Cloud Computing

➢ Cost Flexibility
◦ Online Market place
➢ Scalability
◦ Online Video retailer
➢ Adaptability

◦ Online Entertainment platform


➢ Hidden Complexity
◦ Access to services having sophisticated technology
➢ Context-driven Variability
◦ Intelligent Assistants
➢ Access to Information
◦ Ecosystem
Cost Flexibility – Online Market place
➢ Gains access to more powerful analytics online
➢ Economical, thanks to the “pay-as-you-go” cost structure
➢ Additionally, the cloud removes the need to fund
▪ Building of hardware
▪ Installing software
▪ Paying dedicated software license fees
Etsy – Online Market place
➢ Online Market Place for handmade goods
◦ Brings buyers and sellers together
◦ Provides recommendations for buyers.
➢ Cloud-based capabilities
◦ Company is able to analyze data from one billion monthly views of its Web
site
◦ Use the information to create product recommendations
Scalability
➢ Cloud enables businesses to add or provision computing resources just at the time
they’re needed
Scalability – Online Video retailer
➢ Netflix scales cloud resources up and down to meet demand for its Internet
subscription service for movies and TV shows
➢ Netflix streams many movies and shows on demand even at peak times
➢ Netflix migrated its website and streaming service from the traditional data center to
the cloud environment
Adaptability
➢ Cloud applications adapt to diverse user groups with a diverse assortment of devices

Adaptability – Online Entertainment platform


🞂 ActiveVideo, creator of CloudTV, a cloud-based platform that unifies all forms of
content
◦ Web, television, mobile, social, video-on-demand – onto any video screen, be
it set-top boxes, PCs, or mobile devices
🞂 CloudTV leverages content stored and processed in the network cloud
◦ Significantly expand the availability of Web-based user experiences
◦ Allow operators to quickly deploy a consistent user interface across diverse set-
top boxes and connected devices
Hidden Complexity
➢ As complexity is masked from the end user, a company can expand its product and
service sophistication
◦ without increasing the level of user knowledge to utilize
➢ Upgrades and maintenance can be done in the “background” without the end user
having to participate
➢ Xerox’s Cloud Print solution enables users to get desired content in printed form wherever
they might be, by using Xerox’s cloud to access printers outside their own organization
◦ Printing from the cloud requires quite a bit of data management, with numerous
files to be stored, converted to print-ready format, and distributed to printers
◦ This complexity is hidden from users
Access to Information – Ecosystem
➢ Ecosystem connectivity enables information exchange across business partners
➢ HealthHiway, an online health information network
o Enables the exchange of health information and transactions among healthcare
providers, employers, payers, practitioners, third-party administrators and
patients in India
o By connecting more than 1,100 hospitals and 10,000 doctors, the company’s
software-as-a-service solution facilitates better collaboration and information
sharing, helping deliver improved care at a low cost, particularly important in
growing markets

Mobile Internet Devices and the Cloud

Advantages of MCC (Mobile Cloud Computing)

➢ Extending battery lifetime


◦ Computation offloading migrates large computations and complex processing
from resource-limited devices (i.e., mobile devices) to resourceful machines
(i.e., servers in clouds).
◦ Remote application execution can save energy significantly.
◦ Many mobile applications take advantage of task migration and remote
processing.
➢ Improving data storage capacity and processing power
◦ MCC enables mobile users to store/access large data on the cloud.
◦ MCC helps reduce the running cost for computation intensive applications.
◦ Mobile applications are not constrained by storage capacity on the devices
because their data now is stored on the cloud.
➢ Improving reliability and availability
◦ Keeping data and applications in the cloud reduces the chance of data loss on
the mobile devices.
◦ MCC can be designed as a comprehensive data security model for both service
providers and users:
➢ Protect copyrighted digital contents in clouds.
➢ Provide security services such as virus scanning, malicious code
detection, authentication for mobile users.
◦ With data and services in the cloud, they are almost always available even
when the users are moving.
➢ Dynamic provisioning
◦ Dynamic on-demand provisioning of resources on a fine-grained, self-service
basis
◦ No need for advanced reservation
➢ Scalability
◦ Mobile applications can be performed and scaled to meet the unpredictable user
demands
◦ Service providers can easily add and expand a service

➢ Multi-tenancy
◦ Service providers can share resources and costs to support a variety of
applications and a large number of users.
➢ Ease of Integration
◦ Multiple services from different providers can be integrated easily through the
cloud and the Internet to meet the users’ demands.
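The "extending battery lifetime" point above can be sketched as a toy offloading decision: run a task locally, or pay the transmission cost to run it in the cloud. The function name and all energy figures here are invented for illustration only.

```python
def should_offload(cycles, data_bytes,
                   energy_per_cycle=1e-9,   # J per local CPU cycle (assumed)
                   energy_per_byte=1e-7):   # J per byte transmitted (assumed)
    """Offload when transmitting the input costs less energy than
    computing the task locally on the mobile device."""
    local_energy = cycles * energy_per_cycle
    offload_energy = data_bytes * energy_per_byte
    return offload_energy < local_energy

# Heavy computation over a small input: offloading saves energy.
print(should_offload(cycles=10**9, data_bytes=10**4))   # → True
# Light computation over a large input: stay local.
print(should_offload(cycles=10**6, data_bytes=10**6))   # → False
```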
MCC Applications
➢ Mobile Commerce
◦ M-commerce allows business models for commerce using mobile devices
◦ Examples: Mobile financial, mobile advertising, mobile shopping
◦ M-commerce applications face various challenges (low bandwidth, high
complexity of devices, security)
◦ Integrated with cloud can help address these issues
◦ Example: Combining 3G and cloud to increase data processing speed and
security level.
➢ Mobile Learning
◦ M-learning combines e-learning and mobility
◦ Traditional m-learning has limitations on high cost of devices/network, low
transmission rate, limited educational resources
◦ Cloud-based m-learning can solve these limitations
◦ Enhanced communication quality between students and teachers
◦ Help learners access remote learning resources
◦ A natural environment for collaborative learning
➢ Mobile Healthcare
◦ M-healthcare aims to minimize the limitations of traditional medical treatment
(e.g., small storage, security/privacy, medical errors)
◦ M-healthcare provides mobile users with convenient access to resources (e.g.,
medical records)
◦ M-healthcare offers hospitals and healthcare organizations a variety of on-
demand services on clouds
◦ Examples
➢ Comprehensive health monitoring services
➢ Intelligent emergency management system

➢ Health-aware mobile devices (detect pulse-rate, blood pressure, level


of alcohol etc)
➢ Pervasive access to healthcare information
➢ Pervasive lifestyle incentive management (to manage healthcare
expenses)
➢ Mobile Gaming
◦ M-game is a high potential market generating revenues for service providers.
◦ Can completely offload game engine requiring large computing resource (e.g.,
graphic rendering) to the server in the cloud.
◦ Offloading can also save energy and increase game playing time (e.g., MAUI
allows fine-grained, energy-aware offloading of mobile code to a cloud)
◦ Rendering adaptation technique can dynamically adjust the game rendering
parameters based on communication constraints and gamers’ demands
➢ Assistive technologies
◦ Pedestrian crossing guide for blind and visually-impaired
◦ Mobile currency reader for blind and visually impaired
◦ Lecture transcription for hearing impaired students
🞂 Other applications
◦ Sharing photos/videos
◦ Keyword-based, voice-based, tag-based searching
◦ Monitoring a house, smart home systems

Lecture 7:
Hadoop:
Hadoop is an open source software programming framework for storing a large amount of data
and performing the computation. Its framework is based on Java programming with some native
code in C and shell scripts.
Hadoop is an open-source software framework that is used for storing and processing large
amounts of data in a distributed computing environment. It is designed to handle big data and is
based on the MapReduce programming model, which allows for the parallel processing of large
datasets.
Hadoop has two main components:
• HDFS (Hadoop Distributed File System): This is the storage component of Hadoop, which
allows for the storage of large amounts of data across multiple machines. It is designed to
work with commodity hardware, which makes it cost-effective.
• YARN (Yet Another Resource Negotiator): This is the resource management component of
Hadoop, which manages the allocation of resources (such as CPU and memory) for
processing the data stored in HDFS.
• Hadoop also includes several additional modules that provide additional functionality, such
as Hive (a SQL-like query language), Pig (a high-level platform for creating MapReduce
programs), and HBase (a non-relational, distributed database).

• Hadoop is commonly used in big data scenarios such as data warehousing, business
intelligence, and machine learning. It’s also used for data processing, data analysis, and data
mining. It enables the distributed processing of large data sets across clusters of computers
using a simple programming model.

Features of hadoop:
1. It is fault tolerant.
2. It is highly available.
3. Its programming model is easy.
4. It has huge, flexible storage.
5. It is low cost.
Advantages and Disadvantages of Hadoop

Advantages:
• Ability to store a large amount of data.
• High flexibility.
• Cost effective.
• High computational power.
• Tasks are independent.
• Linear scaling.

Disadvantages:
• Not very effective for small data.
• Hard cluster management.
• Has stability issues.
• Security concerns.
• Complexity: Hadoop can be complex to set up and maintain, especially for organizations
without a dedicated team of experts.
• Latency: Hadoop is not well-suited for low-latency workloads and may not be the best
choice for real-time data processing.
• Limited Support for Real-time Processing: Hadoop’s batch-oriented nature makes it less
suited for real-time streaming or interactive data processing use cases.
• Limited Support for Structured Data: Hadoop is designed to work with unstructured and
semi-structured data, it is not well-suited for structured data processing.
• Data Security: Hadoop does not provide built-in security features such as data
encryption or user authentication, which can make it difficult to secure sensitive data.
• Limited Support for Ad-hoc Queries: Hadoop’s MapReduce programming model is not
well-suited for ad-hoc queries, making it difficult to perform exploratory data analysis.
• Limited Support for Graph and Machine Learning: Hadoop’s core component HDFS
and MapReduce are not well-suited for graph and machine learning workloads,
specialized components like Apache Graph and Mahout are available but have some
limitations.
• Cost: Hadoop can be expensive to set up and maintain, especially for organizations with
large amounts of data.
• Data Loss: In the event of a hardware failure, the data stored in a single node may be
lost permanently.
• Data Governance: Data governance is a critical aspect of data management; Hadoop
does not provide a built-in feature to manage data lineage, data quality, data
cataloging, and data audit.

MapReduce
MapReduce is a parallel, distributed programming model in the Hadoop framework that can be
used to access the extensive data stored in the Hadoop Distributed File System (HDFS). Hadoop
is capable of running MapReduce programs written in various languages such as Java, Ruby,
and Python. A key benefit of MapReduce is that its programs are inherently parallel, which
makes very large-scale data analysis easier.

When the MapReduce programs run in parallel, it speeds up the process. The process of running
MapReduce programs is explained below.

• Dividing the input into fixed-size chunks: Initially, the work is divided into equal-sized
pieces. When file sizes vary, dividing the work one file per process is not a straightforward
method to follow, because some processes will finish much earlier than others while some
may take a very long run to complete their work. A better approach is to split the input into
fixed-size chunks and assign each chunk to a process.

• Combining the results: Combining results from independent processes is a crucial task in
MapReduce programming because it may often need additional processing such as
aggregating and finalizing the results.
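The map, shuffle, and reduce steps described above can be sketched in plain Python as a single-process stand-in for the word-count job Hadoop would run in parallel across a cluster (function names here are illustrative, not Hadoop APIs):

```python
from collections import defaultdict

def map_phase(chunk):
    # Emit a (word, 1) pair for every word in one fixed-size input chunk.
    return [(word, 1) for word in chunk.split()]

def shuffle(pairs):
    # Group all values by key, as Hadoop does between map and reduce.
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    # Combine the results of the independent map tasks.
    return {word: sum(counts) for word, counts in grouped.items()}

chunks = ["big data big", "data cloud"]          # two input splits
pairs = [p for c in chunks for p in map_phase(c)]
counts = reduce_phase(shuffle(pairs))
print(counts)  # → {'big': 2, 'data': 2, 'cloud': 1}
```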

Virtual Box
Oracle Corporation develops VirtualBox, also known as VB. It acts like a hypervisor for x86
machines. Originally, it was created by Innotek GmbH, which made it accessible to all in 2007.
It was then bought by Sun Microsystems in 2008. Since then, it has been developed by Oracle,
and people refer to it as Oracle VM VirtualBox. VirtualBox comes in a variety of flavors,
depending on the operating system for which it is configured. VirtualBox for Ubuntu is more
common; however, VirtualBox for Windows is also popular. With the introduction of Android
phones, VirtualBox for Android has emerged as the new face of virtual machines in smartphones.

Advantages of Virtual Box


• Isolation - A virtual machine's isolated environment is suitable for testing software or
running programmes that demand more resources than are accessible in other settings.

• Virtualization- VirtualBox allows users to run another OS on a single computer without


purchasing a new device. It generates a virtual machine that functions just like a real
computer, with its own processing cores, RAM, and hard disc space dedicated only to the
virtual environment.
• Cross-Platform Compatibility - VirtualBox can run Windows, Linux, Solaris, OpenSolaris,
and macOS as its host operating system (OS). Users do not have to be concerned
about compatibility difficulties while setting up virtual machines on numerous devices or
platforms.

• Easy Control Panel- VirtualBox's simple control interface makes it easier to configure
parameters like CPU cores and RAM. Users may begin working on their projects within a
few moments of installing the software program on their PCs or laptops.

• Multiple Modes- Users have control over how they interact with their installations. Whether
in full-screen mode, flawless window mode, scaled window mode, or 3D graphics
acceleration. This allows users to customize their experience according to the kind of project
they are working on.

Disadvantages of Virtual Box

• VirtualBox, however, relies on the computer's hardware. Thus, the virtual machine will only
be effective if the host is faster and more powerful. As a result, VirtualBox is dependent on
its host computer.

• If the host computer has any defects and the OS only has one virtual machine, just that system
will be affected; if there are several virtual machines operating on the same OS, all of them
would be affected.

• Though these machines act like real machines, they are not genuine; the host CPU
must mediate their requests, resulting in slower operation. So, when compared to real
computers, these virtual machines are not as efficient.

Lecture 8:
Google App Engine (GAE)
Pre-requisite: - Google Cloud Platform

A scalable runtime environment, Google App Engine is mostly used to run Web applications.
These scale dynamically as demand changes over time, thanks to Google’s vast computing
infrastructure. Because it offers a secure execution environment in addition to a number of
services, App Engine makes it easier to develop scalable and high-performance Web apps.
Google’s applications will scale up and down in response to shifting demand. Cron tasks,
communications, scalable data stores, work queues, and in-memory caching are some of these
services.

The App Engine SDK facilitates the testing and deployment of applications by emulating the
production runtime environment and allowing developers to design and test applications on
their own PCs. When an application is finished, developers can quickly migrate it to App
Engine, put in place quotas to control the cost that is generated, and make the program
available to everyone. Python, Java, and Go are among the languages that are currently
supported.

The development and hosting platform Google App Engine, which powers anything from web
programming for huge enterprises to mobile apps, uses the same infrastructure as Google’s large-
scale internet services. It is a fully managed PaaS (platform as a service) cloud computing
platform that uses in-built services to run your apps. You can start creating almost immediately
after receiving the software development kit (SDK). You may immediately access the Google
app developer’s manual once you’ve chosen the language you wish to use to build your app.

After creating a Cloud account, you may Start Building your App
• Using the Go template/HTML package
• Python-based webapp2 with Jinja2
• PHP and Cloud SQL
• using Java’s Maven
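As an illustration, a minimal deployment descriptor for a Python App Engine app might look like the following sketch; the handler path and module name are placeholders, not part of any specific tutorial above.

```yaml
# app.yaml — deployment descriptor read by App Engine
runtime: python27        # one of the supported runtimes
api_version: 1
threadsafe: true

handlers:
- url: /.*               # route all requests
  script: main.app       # placeholder WSGI application object
```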

The app engine runs the programs on various servers while “sandboxing” them. The app
engine allows a program to use more resources in order to handle increased demands.

Features of App Engine

Runtimes and Languages


To create an application for App Engine, you can use Go, Java, PHP, or Python. You can
develop and test an app locally using the SDK’s deployment toolkit. Each language has its own
SDK and runtime. Your program is run in a:
• Java Run Time Environment version 7
• Python Run Time environment version 2.7
• PHP runtime’s PHP 5.4 environment
• Go runtime 1.2 environment

Generally Usable Features


These are protected by the service-level agreement and deprecation policy of the app engine.
The implementation of such a feature is often stable, and any changes made to it are backward-
compatible. These include communications, process management, computing, data storage,
retrieval, and search, as well as app configuration and management. Features like the HRD
migration tool, Google Cloud SQL, logs, datastore, dedicated Memcached, blob store,
Memcached, and search are included in the categories of data storage, retrieval, and search.

Features in Preview
In a later iteration of the app engine, these functions will undoubtedly be made broadly
accessible. However, because they are in the preview, their implementation may change in ways
that are backward-incompatible. Sockets, MapReduce, and the Google Cloud Storage Client
Library are a few of them.

Experimental Features
These might or might not be made broadly accessible in later app engine updates. They might
be changed in backward-incompatible ways. The “trusted tester” features, however, are only
accessible to a limited user base and require registration in order to utilize them. The
experimental features include Prospective Search, Page Speed, OpenID, OAuth, Datastore
Admin/Backup/Restore, Task Queue Tagging, MapReduce, the Task Queue REST API, and
app metrics analytics.

Third-Party Services
As Google provides documentation and helper libraries to expand the capabilities of the app
engine platform, your app can perform tasks that are not built into the core product you are
familiar with as app engine. To do this, Google collaborates with other organizations. Along with
the helper libraries, the partners frequently provide exclusive deals to app engine users.

Advantages of Google App Engine


The Google App Engine has a lot of benefits that can help you advance your app ideas. This
comprises:
1. Infrastructure for Security: The Internet infrastructure that Google uses is arguably the
safest in the entire world. Since the application data and code are hosted on extremely secure
servers, there has rarely been any kind of illegal access to date.

2. Faster Time to Market: For every organization, getting a product or service to market
quickly is crucial. When it comes to quickly releasing the product, encouraging the
development and maintenance of an app is essential. A firm can grow swiftly with Google
Cloud App Engine’s assistance.

3. Quick to Start: You don’t need to spend a lot of time prototyping or deploying the app to
users because there is no hardware or product to buy and maintain.

4. Easy to Use: The tools that you need to create, test, launch, and update the applications are
included in Google App Engine (GAE).

5. Rich set of APIs & Services: A number of built-in APIs and services in Google App Engine
enable developers to create strong, feature-rich apps.

6. Scalability: This is one of the deciding variables for the success of any software. When using
the Google app engine to construct apps, you may access technologies like GFS, Big Table,
and others that Google uses to build its own apps.

7. Performance and Reliability: Among international brands, Google ranks among the top
ones. Therefore, you must bear that in mind while talking about performance and reliability.

8. Cost Savings: To administer your servers, you don’t need to employ engineers or even do it
yourself. The money you save might be put toward developing other areas of your company.

9. Platform Independence: Since the app engine platform only has a few dependencies, you
can easily relocate all of your data to another environment.
