Cloud Computing
Source: http://www.ibimapublishing.com/journals/CIBIMA/2011/875547/875547.html
Cloud Computing Introduction compiled [email protected] 29Aug2015 pg 2
Virtual machines/servers can be provisioned in the Cloud. A given Cloud may contain multiple
servers/machines supporting various software/services. Connections to these services are made
using Web Services, which are the basis of the application program interfaces (APIs) commonly
used in Cloud Computing.
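As a toy illustration of how a client might consume such a Web Service API (the response shape and field names below are invented for illustration, not any particular provider's API):

```python
import json

# A canned JSON document standing in for the body a cloud provider's
# Web Service might return when asked to list virtual servers.
# The field names ("servers", "id", "state") are illustrative only.
RESPONSE_BODY = """
{
  "servers": [
    {"id": "vm-001", "state": "running"},
    {"id": "vm-002", "state": "stopped"}
  ]
}
"""

def running_servers(body):
    """Parse the response and return the ids of running servers."""
    payload = json.loads(body)
    return [s["id"] for s in payload["servers"] if s["state"] == "running"]
```

In a real client the body would arrive over HTTP from the provider's API endpoint; the parsing step is the same.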
Source: http://www.service-architecture.com/articles/cloud-computing/cloud_computing_explained.html
Introduction: The integration of computer technology into science and daily life has
enabled the collection of massive volumes of data that cannot practically be analyzed
on a single commodity computer, or even a Sun SPARC server, because these
datasets are too large to fit in memory. Consider website transaction logs, credit card
usage records, genome data, RFID and other sensor readings, GIS data, GPS
locations of cell phones, and more. This massive amount of data can be readily
processed on a large pool of cheap commodity PC hardware with specialized
software. Some companies, like Google, Facebook, Yahoo, Amazon and Microsoft,
are already doing this for their own business, and some of them, like Google,
Amazon and Microsoft, rent their pools of resources to users on a "pay per use"
basis.
Fig: Google uses millions of cheap PC hardware machines to build their solutions (Courtesy
Google)
Source: http://www.service-architecture.com/articles/cloud-computing/cloud_computing_categories.html
Fig: Service models and the distinguishing features of the IaaS, PaaS and SaaS models (Courtesy
Google).
Source: http://itknowledgeexchange.techtarget.com/cloud-computing-enterprise/new-hybrid-cloud-models-emerging/
From the point of view of deployment, cloud computing platforms are of three kinds:
public cloud, private cloud and hybrid cloud.
Public cloud means that the cloud infrastructure is owned by a cloud service provider
who sells cloud computing services to the public or to industry. Typically,
these clouds follow a pay-per-use scheme in which users pay for the amount of time they
use the resources offered by the cloud, as with Amazon's EC2 (Elastic Compute Cloud).
Private clouds are typically deployed inside a firewall and managed by the user
organization. The organization owns the software and hardware running in the cloud,
manages the cloud, and provides virtualized cloud resources. Example:
Eucalyptus (Elastic Utility Computing Architecture for Linking Your Programs To Useful
Systems, from the University of California, Santa Barbara) provides an open-source
application that can be used to implement a cloud computing environment in a data
center.
Federated Clouds – This category of solutions is enabled by placing consistent Cloud
Management Platform software on both a Public Cloud and Private Cloud, and leveraging
the federation capabilities between the two (or more) locations. These include VMware
vCloud, Apache CloudStack, and Virtustream xStream. Over time I would expect
OpenStack to add federation capabilities, as multi-cloud interop is one of the core tenets of
that project.
Workload Migration – These solutions are focused on ways to take existing applications
(virtualized or on bare metal) and either seamlessly migrate them to a Public Cloud, or
create synchronization between a Private Cloud and Public Cloud on a per-application
basis. These include AWS's VM Import, CloudVelocity and Ravello Systems.
Layer on top of this the emergence of new Public/Private PaaS platforms
(Apprenda, CloudFoundry, etc.), and we're also beginning to see portability higher up the
stack.
Multi-Cloud Managers – In the past the definition of Hybrid centered around Public +
Private. But as more Public options have evolved to fill niches, offer differentiated pricing or
performance, or geographic presence, a new class of Hybrid has evolved – the multi-Public
hybrid. Companies like Enstratius (Dell), RightScale and AppOps have emerged to provide
leading technology for navigating management and governance between clouds, Public or
Private.
Over the past 4-5 years, the definition of Hybrid Cloud took many forms and meant many
things to different people. The good news is we appear to be coming out of the hype cycle
and are beginning to have wider choices of hybrid solutions to address multiple use-case
and workload requirements. And as even more of these evolve toward software-centric
deployment models, I expect that we'll see the adoption of hybrid architectures accelerate
quite rapidly. And over time, we'll begin to see more of the elements that are considered
"tools" or smaller functionality get integrated into larger cloud offerings (e.g. AWS Storage
Gateway, OpsWorks and CloudWatch).
Hybrid cloud means that the cloud infrastructure consists of two or more kinds of
cloud, say a private cloud and a public cloud, in which each cloud remains independent
but the clouds are combined using certain standards or special techniques and data.
Hybrid clouds combine both private clouds and public clouds, getting the best features
each has to offer.
Amazon Elastic Compute Cloud (EC2) is a web service that provides resizable compute capacity in the
cloud, allowing your applications to respond quickly to rapid fluctuations in demand. At all times, you
can be responsive to your users while ensuring that your computing power is optimally used. With
Amazon EC2, you don't have to maintain excess servers in anticipation of future demand, or continue
to run servers at sub-optimal utilization rates when the demand decreases. Instead, you use auto-
scaling, which means you dial up or dial down the number of EC2 instances you need based on your
current load. Amazon provides many features for you to implement auto-scaling, including web service
APIs to start and stop EC2 instances very quickly. However, it is up to you to determine how many
EC2 instances you need at any given time. This paper focuses on how to implement auto-scaling with
Amazon Simple Queue Service (Amazon SQS).
Amazon SQS is a highly reliable, scalable message queuing service that enables asynchronous
message-based communication between distributed components of an application. Amazon SQS is a
complementary web service that enables you to build highly scalable EC2 applications easily. To learn
more about the benefits of using Amazon SQS with Amazon EC2 read "Getting Started with Amazon
SQS and Amazon EC2".
https://aws.amazon.com/articles/1464
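One simple way to make the "how many instances" decision from an SQS backlog is a threshold rule like the sketch below. The throughput figure, the bounds, and the linear scaling rule are illustrative assumptions, not an AWS-prescribed policy:

```python
import math

def desired_instance_count(queue_depth, msgs_per_instance_per_min,
                           min_instances=1, max_instances=20):
    """Decide how many EC2 worker instances to run from the SQS backlog.

    queue_depth: approximate number of messages waiting in the queue
    msgs_per_instance_per_min: throughput one worker instance sustains
    The bounds and the linear rule are illustrative choices.
    """
    if msgs_per_instance_per_min <= 0:
        raise ValueError("throughput must be positive")
    needed = math.ceil(queue_depth / msgs_per_instance_per_min)
    return max(min_instances, min(max_instances, needed))

# In a real deployment the queue depth would come from SQS, e.g. via boto3:
#   sqs.get_queue_attributes(QueueUrl=url,
#                            AttributeNames=["ApproximateNumberOfMessages"])
# and the result would drive RunInstances / TerminateInstances calls.
```

Polling this decision function on a schedule, rather than reacting to every message, keeps the instance count from thrashing under bursty load.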
Amazon Simple Storage Service, also known as Amazon S3, is data storage in the cloud. It is a
canonical example of the economies of scale Amazon is able to leverage by essentially partitioning
its existing datacenters and renting them out over the internet to users. Data storage is metered and
billed by the gigabyte; users are only charged for the capacity they use, with ability to elastically add
as much storage as needed. Amazon S3 can be used as a content delivery network to deliver huge
amounts of data publicly over the internet, or to power massive databases in conjunction with Amazon
EC2.
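Metered, per-gigabyte billing of this kind can be sketched as below; the tier boundaries and rates are hypothetical, since real S3 pricing is tiered differently and changes over time:

```python
# Hypothetical tiered rates (USD per GB-month); real S3 prices differ.
TIERS = [(1024, 0.030), (10240, 0.028), (float("inf"), 0.025)]

def monthly_storage_cost(gb_used):
    """Metered, tiered billing: each tier's gigabytes are charged at that
    tier's rate, so users pay only for the capacity they actually store."""
    cost, remaining, prev_limit = 0.0, gb_used, 0
    for limit, rate in TIERS:
        in_tier = min(remaining, limit - prev_limit)
        if in_tier <= 0:
            break
        cost += in_tier * rate
        remaining -= in_tier
        prev_limit = limit
    return round(cost, 2)
```

The declining per-gigabyte rate at higher tiers is the economies-of-scale point in miniature: large tenants cost proportionally less to serve.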
An excellent example illustrating the power of Amazon Web Services is the creation of the New York
Times' TimesMachine service. With over 4TB of digitally scanned images of the New York Times
from their archives, processing, storage and delivery to customers would be very expensive from a
traditional IT standpoint. But cloud computing is ideal for bursts of capacity, and as Derek Gottfrid
wrote in his blog for the New York Times, he simply spun up 100 parallel EC2 instances to process
the images into PDFs, rented 4 TB of storage from S3, and finished processing over 11 million
images in just under 24 hours, at an estimated cost of less than $1,000. Like pulling
computing power out of the air, using Amazon EC2 was clearly preferable to buying 100 new
computers for less than 24 hours of work.
Microsoft Azure: Azure is Microsoft's cloud service platform, which relies on
Microsoft's data centers. Azure is an integrated solution, providing both a computing
platform and a set of hosted services.
Platform-as-a-Service
Heroku is a developer-oriented cloud platform for Ruby, a programming language. Targeted toward
developers seeking a rapid development and testing environment on demand, the Heroku platform is
essentially a mass of servers configured to deliver Ruby web apps, sliced up into many provisions
called "dynos" and sold to developers on the fly, thus eliminating the need for a dedicated IT
manager at small programming startups. Developers can add dynos as needed to meet increased
traffic loads. Its architecture contains all of the elements of a development environment typically
maintained in-house, but all aspects of configuration and provision are self-serviced over the
internet.
Google App Engine is similar to Heroku, but instead targets developers of Python and Java, two
programming languages. With Google App Engine, Google specializes in its core competency of
maintaining masses of servers and data centers, allowing users to leverage their own
competency: programming. In other words, should a programmer have a brilliant
idea for a web application, she can use the Google App Engine backend for scalability. If that idea
becomes wildly popular, scaling from 1,000 users to 100,000 users is roughly as simple as paying
for the increased bandwidth and processing. App Engine lets developers build
and deploy data-intensive services written in Java or Python. It provides a hosted
database service and enables embedding queries and update statements to that database
in the Java or Python code. As a database language, App Engine supports GQL (Google Query
Language), an SQL-like dialect. All configuration is done automatically. The pricing model
is a true pay-as-you-go model: if a service is (virtually) inactive, no cost is incurred.
(Only a marginal monthly fee to keep the data of the database persistent is charged.) For active
services, Google charges for network traffic and the CPU hours consumed processing requests.
Software-as-a-Service
Google Apps, different from Google App Engine, is the suite of Google Calendar, Google Docs, and
Gmail. While none of the individual components of Google Apps are unique as web-based software,
what makes Google Apps cloud-based is the ability to use Google Apps on a specific domain for a
business. For example, a non-profit group could set up private calendars internal to the organization,
use domain-specific email hosted on Gmail, and share and collaboratively edit documents with
Google Docs. Without shilling too much for Google Apps, it does meet many of the characteristics
outlined by NIST unique to the cloud paradigm: resource pooling by Google to run email and
calendaring services, on-demand self-service, broad network access from any internet-connected
device including mobile-optimized versions of Google Apps, and in general, the commoditization of
IT assets as a service.
Many technologies, like Hadoop and MapReduce, have come out of Cloud Computing,
especially in the Linux open-source software area.
The Apache Hadoop project develops open-source software for reliable, scalable,
distributed computing. Hadoop includes these subprojects:
• Hadoop Common: The common utilities that support the other Hadoop subprojects.
• HDFS: A distributed file system that provides high throughput access to application
data.
• MapReduce: A software framework for distributed processing of large data sets on
compute clusters.
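The MapReduce model listed above can be imitated in a few lines of plain Python. This is a single-machine toy showing the map, shuffle and reduce phases of a word count, not the Hadoop API:

```python
from collections import defaultdict

def map_phase(document):
    # map: emit a (word, 1) pair for every word in the input split
    return [(word, 1) for word in document.split()]

def shuffle(pairs):
    # shuffle: group all values by key, as the framework does between phases
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # reduce: sum the counts emitted for each word
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["the cloud", "the cluster runs the cloud"]
pairs = [pair for doc in docs for pair in map_phase(doc)]
counts = reduce_phase(shuffle(pairs))
```

In Hadoop the same three phases run across a cluster: mappers on many nodes, a distributed shuffle over the network, and reducers aggregating each key's values.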
Hadoop History
• Dec 2004 – Google MapReduce paper published
• Apr 2007 – Yahoo! on 1000-node cluster
• Jan 2008 – An Apache Top Level Project
Driver – Description
• Availability – Users have the ability to access their resources at any time through a
standard internet connection (e.g. Google Gmail).
• Collaboration – Users are starting to see the cloud as a way to work simultaneously
on common data and information (e.g. Google Docs and Sheets).
• Lower Infrastructure Costs – The pay-per-use model allows an organization to pay
only for the resources it needs, with basically no investment in the physical
resources available in the cloud.
• Mobility – Users have the ability to access data and applications from around
the globe.
• Risk Reduction – Organizations can use the cloud to test ideas and concepts before
making major investments in technology.
• Scalability – Users have access to a large amount of resources that scale based
on user demand.
Barrier – Description
• Interoperability – A set of universal standards and/or interfaces has not yet been defined,
resulting in a significant risk of vendor lock-in.
• Latency – All access to the cloud is done via the internet, introducing latency into
every communication between the user and the provider.
• Platform or Language Constraints – Some cloud providers support only specific platforms
and languages (e.g. Google App Engine for Python and Java, Microsoft for .NET).
• Regulations – There are concerns in the cloud computing community over
jurisdiction, data protection, fair information practices, and international data
transfer; these mainly concern organizations that manage sensitive data.
• Security and Resource Control – The main concern is data privacy: users do not have
control over, or knowledge of, where their data is being stored.
This can add complexity to the application design. There are vendors that provide hybrid cloud
solutions that facilitate taking advantage of cloud bursting.
First, Web Services using SOAP, REST, and JSON are discussed. This is followed by a history
of Web Services covering the Web Services Description Language (WSDL) and Universal
Description, Discovery, and Integration (UDDI).
SOAP
SOAP was originally part of the specification that included the Web Services Description
Language (WSDL) and Universal Description, Discovery, and Integration (UDDI). It is now
used without WSDL and UDDI. Instead of the discovery process described in the History of the
Web Services Specification section below, SOAP messages are hard-coded or generated
without the use of a repository. The interaction is illustrated in the figure below.
1. A service provider describes its service using WSDL. This definition is published
to a repository of services. The repository could use Universal Description, Discovery,
and Integration (UDDI). Other forms of directories could also be used.
2. A service consumer issues one or more queries to the repository to locate a
service and determine how to communicate with that service.
3. Part of the WSDL provided by the service provider is passed to the service
consumer. This tells the service consumer what the requests and responses are for the
service provider.
4. The service consumer uses the WSDL to send a request to the service provider.
5. The service provider provides the expected response to the service consumer.
SOAP
All the messages shown in the above figure are sent using SOAP. (SOAP at one time stood for
Simple Object Access Protocol; now the letters in the acronym have no particular meaning.)
SOAP essentially provides the envelope for sending Web Services messages. SOAP
generally uses HTTP, but other means of connection may be used. HTTP is the familiar
connection we all use for the Internet. In fact, it is the pervasiveness of HTTP connections that
will help drive the adoption of Web Services.
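As a sketch of that envelope, the snippet below builds a minimal SOAP 1.1-style message with Python's standard library. The CustomerInfoRequest element name echoes the example in the text; the <account> field is an illustrative assumption:

```python
import xml.etree.ElementTree as ET

# SOAP 1.1 envelope namespace
SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def make_envelope(payload):
    """Wrap an XML payload element in a SOAP Envelope/Body pair."""
    envelope = ET.Element(ET.QName(SOAP_NS, "Envelope"))
    body = ET.SubElement(envelope, ET.QName(SOAP_NS, "Body"))
    body.append(payload)
    return envelope

# Build a request payload; the <account> child is an illustrative field.
request = ET.Element("CustomerInfoRequest")
ET.SubElement(request, "account").text = "12345"

xml_text = ET.tostring(make_envelope(request), encoding="unicode")
```

The resulting document is what would travel in the HTTP request body; the service-specific content lives inside the Body while the Envelope is the same for every SOAP exchange.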
The next figure provides more detail on the messages sent using Web Services. At the left of the
figure is a fragment of the WSDL sent to the repository. It shows a CustomerInfoRequest that
requires the customer's account to obtain information. Also shown is the CustomerInfoResponse
that provides a series of items on the customer, including name, phone, and address items.
At the right of this figure is a fragment of the WSDL being sent to the service consumer. This is
the same fragment sent to the repository by the service provider. The service consumer uses this
WSDL to create the service request shown above the arrow connecting the service consumer to
the service provider. Upon receiving the request, the service provider returns a message using the
format described in the original WSDL. That message appears at the bottom of the figure.
XML is used to define messages. XML has a tagged message format. You can see this in the
SOAP and REST examples in the first section and in the figure above. In each of the examples,
the tag <city> has the value of Burnsville. And </city> is the ending tag indicating the end of the
value of city. Both the service provider and service consumer use these tags. In fact, the service
provider could send the data shown at the bottom of this figure in any order. The service
consumer uses the tags and not the order of the data to get the data values.
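The tags-over-order point can be demonstrated with Python's standard XML parser; the <address> and <state> elements are invented wrappers, while <city>Burnsville</city> mirrors the example in the text:

```python
import xml.etree.ElementTree as ET

# Two responses carrying the same fields in different order
msg_a = "<address><city>Burnsville</city><state>MN</state></address>"
msg_b = "<address><state>MN</state><city>Burnsville</city></address>"

def get_city(xml_text):
    # The consumer looks the value up by tag name, not by position,
    # so the provider may send the fields in any order.
    return ET.fromstring(xml_text).find("city").text
```

Both orderings yield the same value, which is why the service provider is free to emit the data items in any sequence.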
Source: http://www.service-architecture.com/articles/cloudcomputing/web_services_and_cloud_computing.html
Conclusion:
A lot of technology has been developed from research in Cloud Computing, such as
Eucalyptus. Moreover, over 100 companies and 10 universities are using Hadoop for
large-scale data analysis. Mid-level organisations and resourceful universities can
make use of these technologies to build their own private clouds with commodity
hardware. Use of commercial clouds like Amazon EC2 for genomics data analysis may
also be envisaged. Cloud computing is also beginning to emerge in education, with
learning management systems (LMSs) such as Blackboard or Moodle hosted in the cloud.
Nowadays a pertinent question to ask before any system is built or designed is how
much can be done with cloud computing technology, at much lower cost, rather than
with costly proprietary hardware and software.
2002 – Amazon launched Amazon Web Services (AWS), a suite that included storage,
computation and other services.
2003 – Google's cluster contains 15,000+ commodity machines; ~100 die each day.
2005 – Google's index covers 8 billion pages at 5–10 KB per page; Google is indexing
40–80 TB of data.
2006 – Amazon launched Elastic Compute Cloud (EC2), letting small companies and
individual users run their own computer applications in the cloud.
2007, June – Heroku launched, a PaaS that initially supported the Ruby programming
language but has since added support for Java, Node.js, Scala, Clojure, Python and
PHP; it is now owned by Salesforce.com.
2008 – Eucalyptus ("Elastic Utility Computing Architecture for Linking Your Programs To
Useful Systems") launched, the first open-source AWS-API-compatible platform for
deploying private clouds.
2009 – Google began to offer browser-based enterprise applications as Google Apps.
*** END **