Cloud Computing Notes (UNIT - 01)
[ INTRODUCTION ]
The idea of cloud computing first emerged in the 1950s. In making cloud computing what it is today,
five technologies played a vital role: distributed systems and their peripherals, virtualization,
Web 2.0, service orientation, and utility computing.
1. Distributed Systems:
A distributed system connects different independent systems, which are physically located in
various places, through a network so that they can exchange messages and work together as a
single system. Some examples of distributed systems are Ethernet (a LAN technology),
telecommunication networks, and parallel processing. The basic functions of distributed systems
are resource sharing, openness, concurrency, scalability, fault tolerance, and transparency.
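As a rough illustration, the sketch below lets two processes on one machine exchange a message over TCP sockets, standing in for two independent systems on a network; real distributed systems add naming, coordination, and fault tolerance on top of this, and the host, port, and message are made up for the example.
```python
# Minimal sketch: two "nodes" exchanging a message over TCP sockets.
import socket
import threading

HOST, PORT = "127.0.0.1", 50007   # hypothetical address standing in for a remote node

# Node A binds and listens before node B tries to connect.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.bind((HOST, PORT))
srv.listen(1)

def node_a():
    """Acts as one independent system: waits for a message from the network."""
    conn, _ = srv.accept()
    with conn:
        print("node A received:", conn.recv(1024).decode())
    srv.close()

t = threading.Thread(target=node_a)
t.start()

# Node B, another independent system, sends a message across the network.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as node_b:
    node_b.connect((HOST, PORT))
    node_b.sendall(b"hello from node B")

t.join()
```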
2. Mainframe computing:
Mainframes, which first came into existence in 1951, are highly powerful and reliable computing
machines. They are responsible for handling large volumes of data and massive input-output
operations. Even today they are used for bulk processing tasks such as online transaction
processing. These systems have almost no downtime and high fault tolerance. Despite this
performance, however, mainframe computing is very expensive.
3. Cluster Computing:
In cluster computing, multiple computers are connected so that they act as a single computing
system. Tasks are performed concurrently by each computer, also known as a node, connected to
the network, and the activities performed by any single node are known to all the other nodes,
which can increase performance, transparency, and processing speed. Cluster computing came into
existence to reduce the cost of such processing, and a cluster can be resized by adding or
removing nodes.
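To illustrate nodes working on one job concurrently, here is a minimal sketch that splits a task across worker processes on a single machine; in a real cluster the workers would be separate networked computers managed by a scheduler, and the function and data below are invented for the example.
```python
# Minimal sketch: tasks split across worker processes, standing in for cluster nodes.
from multiprocessing import Pool

def process_chunk(chunk):
    """Work done by one 'node': here, just summing its share of the data."""
    return sum(chunk)

if __name__ == "__main__":
    data = list(range(1_000_000))
    chunks = [data[i::4] for i in range(4)]   # split the job into 4 parts
    with Pool(processes=4) as pool:           # 4 workers ~ 4 nodes
        partial_sums = pool.map(process_chunk, chunks)
    print("total:", sum(partial_sums))        # combine the nodes' results
```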
4. Grid Computing :
Grid computing was introduced in the 1990s. As in the previous computing structures, it includes
different computers or nodes, but in this case the nodes are placed in different geographical
locations and connected to the same network using the internet. Unlike the other computing
methods seen so far, which have homogeneous nodes located in the same place, grid computing
uses nodes that are placed in, and may belong to, different organizations.
5. Virtualization:
Virtualization refers to creating a virtual layer over the hardware which allows the user to run
multiple instances simultaneously on the same physical machine. It is the base on which major
cloud computing services such as Amazon EC2, VMware vCloud, etc. work. Hardware virtualization
is still one of the most common types of virtualization.
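As a hedged sketch of how virtualization is consumed through a service like Amazon EC2, the snippet below requests one virtual machine using the boto3 library; the AMI ID is a placeholder and configured AWS credentials are assumed.
```python
# Minimal sketch: asking EC2 for one virtual machine with boto3.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical machine image
    InstanceType="t2.micro",           # small VM size
    MinCount=1,
    MaxCount=1,
)

# Each returned instance is a virtual machine carved out of shared physical hardware.
print("launched:", response["Instances"][0]["InstanceId"])
```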
6. Web 2.0:
Web 2.0 is the interface through which cloud computing services interact with clients. It is
because of Web 2.0 that we have interactive and dynamic web pages, and it also increases
flexibility among web pages. Popular examples of Web 2.0 include Google Maps, Facebook, Twitter,
etc. Needless to say, social media is possible only because of this technology.
8. Utility computing:
Utility computing is a computing model that defines service provisioning techniques for services
such as compute, storage, and infrastructure, which are provisioned on a pay-per-use basis.
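A minimal sketch of the pay-per-use idea, using made-up rates rather than any provider's real prices:
```python
# Minimal sketch of pay-per-use billing; the rates below are hypothetical.
RATES = {
    "compute_hours": 0.05,   # $ per VM-hour (assumed)
    "storage_gb":    0.02,   # $ per GB-month (assumed)
}

def monthly_bill(usage: dict) -> float:
    """Charge only for what was actually consumed, like a utility meter."""
    return sum(RATES[item] * amount for item, amount in usage.items())

print(monthly_bill({"compute_hours": 720, "storage_gb": 100}))  # 38.0
```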
2. Scalability : Scalability is the ability of a system to handle a growing amount of work by
adding resources to it. Continuous business expansion demands a rapid expansion of cloud
services.
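A small sketch of the scaling idea: work out how many servers are needed as the workload grows. The per-server capacity figure is an assumption for illustration, not any provider's actual autoscaling rule.
```python
# Minimal sketch of a horizontal-scaling rule: add servers as load grows.
def servers_needed(requests_per_sec: int, capacity_per_server: int = 100) -> int:
    """Scale out by adding enough servers to absorb the current workload."""
    return max(1, -(-requests_per_sec // capacity_per_server))  # ceiling division

for load in (80, 250, 1200):
    print(f"{load} req/s -> {servers_needed(load)} server(s)")
```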
4. Broad network access : One of the most interesting features of cloud computing
is that it knows no geographical boundaries. Cloud computing has a vast access
area and is accessible via the internet. You can access your files and documents or upload your
files from anywhere in the world; all you need is a good internet connection and a device, and
you are set to go.
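As a rough illustration, the sketch below downloads a document over the internet with the requests library; the URL is a placeholder for wherever the cloud provider exposes your file.
```python
# Minimal sketch: fetching a stored document over the internet from any device.
import requests

url = "https://storage.example.com/my-bucket/report.pdf"  # hypothetical download link

response = requests.get(url, timeout=30)
response.raise_for_status()            # fail loudly if the file is unreachable

with open("report.pdf", "wb") as f:
    f.write(response.content)          # same file, retrieved from anywhere
```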
7. Resilience : Resilience in cloud computing means its ability to recover from any
interruption. A cloud service provider has to be prepared for disasters or unexpected
circumstances, since a lot is at stake.
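A minimal sketch of one common resilience technique, retrying with exponential backoff so that short interruptions do not turn into failures; flaky_call is a made-up stand-in for any remote request.
```python
# Minimal sketch: retry a cloud call with exponential backoff.
import random
import time

def flaky_call() -> str:
    """Stands in for any remote request that occasionally fails."""
    if random.random() < 0.5:
        raise ConnectionError("temporary outage")
    return "ok"

def call_with_retries(attempts: int = 5) -> str:
    for attempt in range(attempts):
        try:
            return flaky_call()
        except ConnectionError:
            time.sleep(2 ** attempt)    # wait longer after each failure
    raise RuntimeError("service did not recover in time")

print(call_with_retries())
```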
1. Online Data Storage : Organizations have a lot of data to store, and the size of this data
increases over time. This data can be in any format, such as text, image, audio, or video. To
store and maintain this huge amount of data, organizations no longer need to set up physical
storage systems.
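As a hedged example, the sketch below uploads a local file to Amazon S3 object storage with boto3; the file name, bucket name, and configured credentials are assumptions for illustration.
```python
# Minimal sketch: pushing a local file into cloud object storage
# instead of buying on-premises disks.
import boto3

s3 = boto3.client("s3")

# Any format works: text, image, audio, or video files are all just objects here.
s3.upload_file("quarterly_report.mp4", "my-company-archive",
               "videos/quarterly_report.mp4")

print("stored in the cloud; no physical storage system needed on site")
```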
2. Backup and Recovery : Cloud service providers offer a lot of options for data
recovery. They offer various recovery plans at different costs. Companies can
decide which plan they need based on their requirements.
The cloud provider also offers data redundancy, i.e., a copy of the data is stored in different
places.
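A minimal sketch of such redundancy, using boto3 to copy an object from a primary bucket to a backup bucket; the bucket and key names are placeholders.
```python
# Minimal sketch of data redundancy: keep a second copy of an object in another
# bucket (which could live in a different region).
import boto3

s3 = boto3.client("s3")

s3.copy_object(
    Bucket="my-backup-bucket",                                  # destination copy
    Key="reports/2023.csv",
    CopySource={"Bucket": "my-primary-bucket", "Key": "reports/2023.csv"},
)

print("redundant copy created; recovery can read from either location")
```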
3. Big Data Analysis : Big data analysis involves dealing with huge amounts of data, with sizes
ranging from terabytes to zettabytes (known as big data). Cloud computing allows us to store
large data sets that include structured and unstructured data from different sources, in sizes
from terabytes to zettabytes, and it also provides various tools to analyse this big data.
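As a rough illustration of analysing data that does not fit in memory, the sketch below streams a large CSV in chunks with pandas; the file and column names are hypothetical, and real big-data pipelines would typically use distributed cloud tools.
```python
# Minimal sketch: analyse a dataset too large for memory by streaming it in chunks.
import pandas as pd

total_rows = 0
running_sum = 0.0

for chunk in pd.read_csv("events.csv", chunksize=1_000_000):   # 1M rows at a time
    total_rows += len(chunk)
    running_sum += chunk["bytes_transferred"].sum()

print("rows processed:", total_rows)
print("average bytes per event:", running_sum / total_rows)
```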