Utility Computing, VM, Enterprise Grid Computing

Utility computing is a service model that provides on-demand computing resources to clients, charging based on actual usage rather than fixed fees. It allows businesses to scale their IT resources dynamically, reducing management complexity and operational costs while enhancing flexibility and efficiency. The implementation involves assessing needs, evaluating service providers, and ensuring security measures are in place to optimize resource utilization.


Utility Computing

Utility computing is a service provisioning model that offers computing resources such as hardware, software, and network bandwidth to clients on demand, as and when they require them. The service provider charges only as per the consumption of the services, rather than a fixed charge or a flat rate.

Utility computing is a subset of cloud computing, allowing users to scale up and down based on their needs. Clients, users, or businesses acquire amenities such as data storage space, computing capabilities, application services, virtual servers, or even hardware rentals such as CPUs, monitors, and input devices.

The utility computing model is based on conventional utilities and originates from
the process of making IT resources as easily available as traditional public utilities
such as electricity, gas, water, and telephone services. For example, a consumer
pays his electricity bill as per the number of units consumed, nothing more and
nothing less. Similarly, utility computing works on the same concept, which is a
pay-per-use model.
The service provider owns and manages the computing solutions and
infrastructure, and the client subscribes to the same and is charged in a metered
manner without any upfront cost. The concept of utility computing is simple—it
provides processing power when you need it, where you need it, and at the cost of
how much you use it.
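To make the pay-per-use idea concrete, here is a minimal metered-billing sketch in Python. The resource names and per-unit rates are invented for illustration; a real provider meters far more dimensions and publishes its own price sheet.

```python
from dataclasses import dataclass

# Hypothetical per-unit rates; real providers publish their own price sheets.
RATES = {
    "cpu_hours": 0.04,          # per vCPU-hour
    "storage_gb_month": 0.02,   # per GB stored for a month
    "bandwidth_gb": 0.08,       # per GB transferred out
}

@dataclass
class UsageRecord:
    resource: str    # key into RATES
    quantity: float  # units consumed in the billing period

def metered_bill(usage: list[UsageRecord]) -> float:
    """Charge only for what was consumed: no flat fee, no upfront cost."""
    return sum(RATES[u.resource] * u.quantity for u in usage)

if __name__ == "__main__":
    month = [
        UsageRecord("cpu_hours", 1200),
        UsageRecord("storage_gb_month", 500),
        UsageRecord("bandwidth_gb", 80),
    ]
    # 1200*0.04 + 500*0.02 + 80*0.08 = 64.40
    print(f"Amount due: ${metered_bill(month):.2f}")
```

Like the electricity example, the bill is simply rate times consumption per resource, summed across everything the client actually used.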

5-Step Process With Examples


The concept of utility computing is transformational. Service providers match
demand with delivery almost immediately. It enables businesses to transform
their organization from a traditional environment to a more dynamic,
adaptive, and service-oriented one. When demand exceeds capacity, utility
computing becomes the perfect solution to top up resources as and when
needed.

However, a major question remains: where and how does one get started with utility computing? Let’s understand the implementation process of utility computing with the help of examples.

Step 1: Determine the need

The initial steps involved are assessing internal organizational needs and the
combination of services and resources required. Utility computing-hosting
centers exist for a reason. They provide valuable, tightly integrated, fully
customized utility computing solutions and resources as per clients’ needs.
However, all of this will not matter if your organization is clueless about its
actual objectives and needs.

Step 2: Evaluate the service provider’s claims

Once your objectives are determined, evaluate whether the utility computing solution will align with your goals and mission. Understanding which tasks will be supported and what level of resources or services will be provided is important. Further, ask important questions:

 Will the service provider customize solutions according to your needs?
 Will they leverage automation?
 Will their collection of computing resources, software, and configuration provide maximum benefits to the users?

Evaluating the service provider’s offer is essential to determine whether their service will empower users to be more effective in accomplishing their goals on time.

Step 3: Assess the health of a computing resource

To assess the health of a computing resource, it is critical to deploy resource monitoring tools that look after its security and dynamic resource configuration requirements. Monitoring a utility computing resource involves identifying failures in the network, storage, and application resources. One of the best ways to keep monitoring simple is to use a collection of industry-standard and site-specific tools and local configuration files that stay completely in sync with the overall utility computing environment.
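As an illustration of Step 3, the sketch below polls the three resource classes named above and reports failures. The probe functions, interval, and resource names are placeholders; in practice they would call the site-specific monitoring tools mentioned in the text.

```python
import time
from typing import Callable

# Hypothetical health checks for the three resource classes named above.
# In practice these would call real network, storage, and application probes.
def check_network() -> bool:
    return True  # e.g. reach the provider's endpoint

def check_storage() -> bool:
    return True  # e.g. verify volumes mount and have free space

def check_application() -> bool:
    return True  # e.g. hit the application's health endpoint

CHECKS: dict[str, Callable[[], bool]] = {
    "network": check_network,
    "storage": check_storage,
    "application": check_application,
}

def monitor(interval_seconds: int = 60, rounds: int = 3) -> None:
    """Poll each resource and report failures: the core of Step 3."""
    for _ in range(rounds):
        for name, probe in CHECKS.items():
            status = "OK" if probe() else "FAILED"
            print(f"{name:12s} {status}")
        time.sleep(interval_seconds)

if __name__ == "__main__":
    monitor(interval_seconds=1, rounds=1)
```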

Step 4: Identify the resource provisioning requirements

The next step involves analyzing the service provider’s capability to customize and configure resources to meet customer needs and to establish a load balance without overprovisioning or underprovisioning resources. The provisioning interface must use resource command-line interfaces, APIs, and expect scripts to detect potential failures and monitor changes.
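A hedged sketch of the provisioning decision Step 4 describes: compute how many servers to run from the observed load so that capacity is neither overprovisioned nor underprovisioned. The per-server capacity and target utilization figures are assumptions for illustration only.

```python
import math

SERVER_CAPACITY_RPS = 500   # assumed requests/second one server can handle
TARGET_UTILIZATION = 0.7    # keep headroom so spikes do not cause failures

def servers_needed(observed_rps: float,
                   min_servers: int = 1, max_servers: int = 50) -> int:
    """Return the server count to provision for the observed load."""
    required = math.ceil(observed_rps / (SERVER_CAPACITY_RPS * TARGET_UTILIZATION))
    return max(min_servers, min(max_servers, required))

if __name__ == "__main__":
    for load in (200, 1800, 9000):
        print(load, "->", servers_needed(load))  # 1, 6, 26 servers
```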

Step 5: Map out a timeframe

Once the need, objectives, and type of resources are determined, the final step
for architecting a utility computing solution involves mapping out the
schedule, identifying when a specific resource might be needed, and for how
much time. This allows the service provider to release unused resources early
and improve the overall resource utilization strategy.
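Step 5 is essentially a reservation calendar. The short sketch below, with hypothetical resource names and dates, records when each resource is needed and flags anything whose window has ended so it can be released back to the provider early.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class Reservation:
    resource: str
    start: date
    end: date

def releasable(reservations: list[Reservation], today: date) -> list[str]:
    """Resources whose reserved window has ended can be handed back."""
    return [r.resource for r in reservations if r.end < today]

if __name__ == "__main__":
    plan = [
        Reservation("extra-web-servers", date(2024, 12, 1), date(2025, 1, 5)),
        Reservation("batch-cpu-pool", date(2024, 11, 15), date(2024, 11, 30)),
    ]
    print(releasable(plan, today=date(2024, 12, 10)))  # ['batch-cpu-pool']
```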
Examples of utility computing

The utility computing model has become one of the most popular IT provisioning models. It offers many advantages to businesses, such as eliminating internal IT management headaches and the need for software licenses. The arrival of public cloud utility solutions has been a game-changer in this regard. The utility model aims to maximize the productive use of resources and minimize the costs that come along with them.

1. Travel reservation services


The travel and hotel industry is highly dependent on seasonal demand and
peak festival times. COVID-19 travel restrictions have also played a huge
part in setting new trends in the hospitality industry. As countries are opening
their borders to international travel, we are seeing a surge in demand for
tickets to exotic destinations.

Let’s assume you wish to travel to the Maldives and are looking to make a flight and hotel booking through your travel app. Due to the rise in demand, travel reservation applications will deploy additional infrastructural support and virtual servers to manage the influx of travelers wanting to make their reservations. This way, travel applications get extra resources on board when they require them and pay only based on their consumption.

2. Online retailers
With Christmas and New Year around the corner, online retailers will witness
massive traffic jumps and endure extreme load on their servers. Let’s assume
you wish to do a little redecoration before the festive season hits, and you
turn to the Swedish Gods of DIY furnishing, aka Ikea. This is where utility
computing enters. Online retailers would deploy additional data storage space
to manage the online surge and get charged on a rental basis.

3. Startups and small businesses


Let’s say a startup sets up a new business and uses utility computing services
to rent hardware units such as CPUs and monitors. Once their business
catches the eye of thousands of customers, they would need to scale up by
requesting more resources, bandwidth, and data storage space. Through this
method, the startup only needs to pay for the resources it utilizes and can
focus on flourishing its business without the worry of ongoing operational
costs hindering business growth.

Key Benefits of Utility Computing for Enterprises


The transition to utility computing has been quite compelling. While utility computing offers various benefits, some organizations stand to gain the most from it: those with a great deal of seasonality and swings in their business, or those with annual peaks in capacity demand.

Utility computing is an uncomplicated, scalable, and cost-effective approach to managing IT needs. It is a bankable solution for rapid digital transformation. This brings us to unwrapping the key benefits of utility computing for enterprises.

1. Removes the complexity of IT management

Before utility computing became prominent, the older model tied IT to the safekeeping of a large block of resources. Utility computing has been instrumental in reducing the complexity of IT architectures and their management. Signing up with a utility service provider absolves the user from the responsibility of maintaining IT resources, including hardware and software. The need to spend time and resources on server maintenance is completely eliminated with this model.

2. Saves valuable time & resources

The growing complexity of networks consumes a large amount of resources and management time. Utility computing marks the beginning of the end of this increasing complexity. When the maintenance and management of IT architectures and servers fall into the service provider’s hands, organizations can conserve a lot of precious time, allowing them to focus on other pressing business concerns. Utility computing facilitates agility and integration between IT resources and enterprises.

3. Offers complete flexibility

For years, enterprises have been looking for a model that provides flexibility and a bottom-up provisioning system. Utility computing provides the utmost flexibility in terms of availability of resources, on-demand usage, billing methods, and ease of accessing data anytime and anywhere. Utility computing simplifies the process of handling peak needs. For instance, since you neither own the resources nor rent them long-term, it becomes extremely easy to change the number of services, shrinking or expanding them based on changes in season, demand, audience, or new efficiencies.

4. Facilitates minimal financial layout and maximum savings

Utility computing has created a storm in the business world primarily because
of its flexibility and better economics. Its pay-per-use method of billing lets
organizations pay for only those computing resources that they require. This
leads to maximum cost savings for organizations. From reduction in
operational costs, savings on capital expenses, and doing away with the initial
costs of acquiring new resources to significantly lowered IT costs, this model
is a complete package deal for enterprises across business verticals.

5. Allows shorter time to market

Utility computing allows resources to be supplied in small, incremental bites as and when required by an organization. This helps organizations deliver fast and demonstrable output, with a substantial return on investment, without having to wait for the full implementation to achieve payoffs.

Utility Computing Best Practices


1. Assess current workload

Before adopting any utility computing strategy, the first step is to assess the current workload and correctly identify your organization’s needs. The next step is to plan and develop a computing strategy that aligns with the overall business strategy. A strong business case needs to be made that evaluates the crucial considerations. For example, a public utility computing system wouldn’t be ideal if an organization deals with highly confidential data; here, offloading only the seasonal workload is the better choice.

Therefore, it is essential to adopt a model that facilitates your business objectives, minimizes risks, and adds to the value of the investment by meeting or exceeding your organization’s needs.

2. Choose a reliable utility service provider

The basic foundation of any computing service is built on choosing a reliable utility service provider. For an organization to partner with any service provider, the latter must deliver on all counts, adopt the best built-in security protocols, and conform to the highest industry standards. A service provider should also extend a marketplace of partnership modules and solutions to further enhance the security of an organization’s deployment.

However, the true mark of a reliable utility service provider is their range of security compliance and certifications. Ideally, this should be publicly available, as is the case with leading providers such as Microsoft.

3. Uphold transparency about shared responsibility

When an organization partners with a utility service provider to shift systems and data, it enters a partnership of shared responsibility for security implementation. The key lies in discovering which security tasks fall under the organization’s responsibility and which are handled by the service provider. Leading IaaS and PaaS providers such as Amazon and Microsoft always provide documentation to their customers, upholding absolute transparency about where specific aspects of responsibility lie under the different deployment models.

This clarity prevents security breaches caused by security needs falling through the cracks.

4. Discuss all security concerns with the service provider

Beyond the shared responsibilities, an organization should ask its utility service provider what security measures and processes have been put in place. It is easy, and even hazardous, to assume that the service provider has all security handled. While that can be true, security methods and procedures vary vastly from vendor to vendor.

Some aspects to take into consideration while discussing security concerns with the utility service provider are:

 Geographically, where do the utility service provider’s servers reside?
 What is the utility service provider’s protocol for suspected security breaches and incidents? How is it mitigated?
 What is the level of technical support the utility service provider is willing to extend?
 Does the utility service provider encrypt data, both in transit and at rest?
 Who has access to the data stored in the computing system from the service provider’s end?
 What authentication protocols and compliance requirements does the utility service provider support?
 Does the utility service provider conduct frequent penetration tests, and what are their results?
5. Ensure training for all employees

For any organization, its employees are the first line of defense in secure utility computing. Their security practices and knowledge can either expose a system to cyberattacks or protect it. For this purpose, comprehensive training for all employees is essential. This prevents hackers from accessing credentials for utility computing tools, enables users to spot threats, and equips them to respond appropriately. The threat landscape is constantly evolving, and without top-to-bottom visibility of all systems interacting with the organization’s data, it is next to impossible to take stock of all the security vulnerabilities and address them.

Hence, for advanced users such as IT security teams and administrators directly involved with utility computing, organizations should consider industry-specific training and certifications.

6. Maintain visibility for utility computing services


For many organizations, utility computing services can be diverse and short-lived. Moreover, these services can be spread across service providers and geographies. According to research, utility computing services on the cloud have an average lifespan of two hours. This can create blind spots in the computing environment, and what can’t be seen can’t be secured.

Owing to this, it is essential for any organization that the utility computing
solution they adopt ensures the visibility of the entire ecosystem. This aids in
the better implementation of security policies throughout resources and helps
identify risks effectively.

7. Scrutinize security contracts and SLAs

Security contracts and SLAs are the only assurances an organization can
completely rely on when it comes to utility computing or any computing
service. However, according to the McAfee Cloud Adoption and Risk Report,
2019, 62% of service providers don’t disclose that the clients own their data.
This leaves a legally grey area.
Assess the terms and conditions, annexes, and appendices to determine who owns the data and what happens if your organization decides to terminate the service.

8. Set up identity and access management solution

One of the biggest identified computing threats is unauthorized access. With each passing day, hackers’ methods of gaining access to personal information through attacks are becoming more sophisticated. A high-quality identity and access management solution helps mitigate these risks. As per experts, organizations should look for solutions that let them define and enforce least-privilege policies. In other words, permissions should be granted strictly according to the needs of each role.

To better mitigate the risk of malicious actors gaining access to sensitive information, organizations should equip all their systems with multi-factor authentication. This prevents attackers from accessing sensitive information even if they manage to steal usernames and passwords. Additionally, if an organization adopts solutions that work in a hybrid environment, it enables consistent enforcement of policies for security staff and end users.
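A minimal sketch of the two controls recommended here: least-privilege roles plus multi-factor authentication. The role names and permission strings are invented; real identity and access management products express the same idea through their own policy languages.

```python
# Hypothetical role -> permissions map following least privilege:
# each role gets only the actions its job requires.
ROLE_PERMISSIONS = {
    "analyst":  {"storage:read"},
    "operator": {"storage:read", "vm:start", "vm:stop"},
    "admin":    {"storage:read", "storage:write", "vm:start", "vm:stop", "iam:manage"},
}

def is_allowed(role: str, action: str, mfa_verified: bool) -> bool:
    """Deny by default; require MFA even when the role holds the permission."""
    if not mfa_verified:
        return False  # a stolen username and password alone is not enough
    return action in ROLE_PERMISSIONS.get(role, set())

if __name__ == "__main__":
    print(is_allowed("analyst", "vm:stop", mfa_verified=True))    # False: not in role
    print(is_allowed("operator", "vm:stop", mfa_verified=False))  # False: no MFA
    print(is_allowed("operator", "vm:stop", mfa_verified=True))   # True
```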

9. Check and recheck compliance requirements

Customer data safety is an asset for every organization and needs increased
protection, especially in the healthcare, finance, and retail sectors. These
sectors deal with personal information and, as such, are required to adhere to
strict compliance requirements geographically and otherwise. Special
compliance regulations such as HIPAA, PCI, CCPA, SOC 2, and GDPR need
to be considered.

Hence, before shortlisting any service provider, you should review specific
compliance needs and ensure that your chosen provider delivers on these
requirements.
10. Leverage automation

One excellent appeal of utility computing is its level of automation. An organization should consider automating its processes before adopting any computing service. This makes processes within the organization, such as application deployment, resource provisioning, or performance monitoring, more seamless, enabling the workforce to focus on more crucial tasks such as re-architecture.
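As an illustration of that automation argument, the sketch below chains provisioning, deployment, and a monitoring hook into one repeatable pipeline. Every step is a stub with made-up parameters; in practice each would call the provider's or the organization's own tooling.

```python
# Hypothetical automation pipeline: provision -> deploy -> monitor.
# Each step is a stub standing in for real provider or in-house tooling.
def provision_resources(plan: dict) -> None:
    print(f"provisioning {plan['servers']} servers with {plan['storage_gb']} GB storage")

def deploy_application(version: str) -> None:
    print(f"deploying application version {version}")

def enable_monitoring(targets: list[str]) -> None:
    print(f"enabling performance monitoring for: {', '.join(targets)}")

def run_pipeline() -> None:
    """One repeatable command replaces several manual, error-prone handoffs."""
    provision_resources({"servers": 4, "storage_gb": 500})
    deploy_application("2.3.1")
    enable_monitoring(["web", "database"])

if __name__ == "__main__":
    run_pipeline()
```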

For any organization, navigating the vast digital landscape and adopting a specific practice such as utility computing can be tricky. These best practices for utility computing enable a better understanding and implementation of such computing services at the core.

Enterprise Grid Computing

Grid computing is when different grids (units of hardware) work for you. In an enterprise, it is always expected that if one system fails, another system takes over the process the first system was executing. Also, suppose you have a growing application: you keep adding hardware resources to meet its performance needs, and at some point the OS kernel becomes busy just managing the resources available to it. To overcome this, we use multiple systems working together on your single application. This is called grid computing.

Grid computing is a distributed architecture of multiple computers connected by networks to accomplish a joint task. These tasks are compute-intensive and difficult for a single machine to handle. Several machines on a network collaborate under a common protocol and work as a single virtual supercomputer to get complex tasks done. This offers powerful virtualization by creating a single system image that grants users and applications seamless access to IT capabilities.
How Grid Computing Works

A typical grid computing network consists of three machine types (a toy scheduling sketch follows the list):

 Control node/server: A control node is a server or a group of servers that administers the entire network and maintains the record of resources in the network pool.
 Provider/grid node: A provider or grid node is a computer that contributes its resources to the network resource pool.
 User: A user refers to the computer that uses the resources on the network to complete the task.
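A toy sketch of the three roles above: a control node keeps the record of provider nodes and hands pieces of a user's job to whichever node has free capacity. The node names, slot counts, and task names are invented to show the structure, not a real grid protocol.

```python
from dataclasses import dataclass, field

@dataclass
class GridNode:
    name: str
    free_slots: int                 # capacity the provider contributes to the pool
    assigned: list[str] = field(default_factory=list)

class ControlNode:
    """Administers the network and keeps the record of pooled resources."""
    def __init__(self, nodes: list[GridNode]):
        self.nodes = nodes

    def schedule(self, tasks: list[str]) -> None:
        for task in tasks:
            node = max(self.nodes, key=lambda n: n.free_slots)
            if node.free_slots == 0:
                print(f"{task}: no capacity left in the pool")
                continue
            node.free_slots -= 1
            node.assigned.append(task)

if __name__ == "__main__":
    pool = [GridNode("lab-pc-1", 2), GridNode("lab-pc-2", 1)]
    control = ControlNode(pool)
    control.schedule(["render-frame-1", "render-frame-2", "render-frame-3"])
    for n in pool:
        print(n.name, n.assigned)
```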
Virtualization in Cloud Computing and Types
Virtualization is a technique for separating a service from the underlying physical delivery of that service. It is the process of creating a virtual version of something, such as computer hardware. It was initially developed during the mainframe era. It involves using specialized software to create a virtual or software-created version of a computing resource rather than the actual version of that resource. With the help of virtualization, multiple operating systems and applications can run on the same machine and the same hardware at the same time, increasing the utilization and flexibility of the hardware.

In other words, virtualization is one of the main cost-effective, hardware-reducing, and energy-saving techniques used by cloud providers. Virtualization allows sharing of a single physical instance of a resource or an application among multiple customers and organizations at one time. It does this by assigning a logical name to physical storage and providing a pointer to that physical resource on demand. The term virtualization is often synonymous with hardware virtualization, which plays a fundamental role in efficiently delivering Infrastructure-as-a-Service (IaaS) solutions for cloud computing. Moreover, virtualization technologies provide a virtual environment not only for executing applications but also for storage, memory, and networking.
1) Host Machine: The machine on which the virtual machine is going to be
built is known as Host Machine.

2) Guest Machine: The virtual machine is referred to as a Guest Machine.
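To illustrate the host/guest split and the "logical name plus pointer to the physical resource" idea above, the sketch below carves a host machine's fixed capacity into guest machines identified by logical names. The capacity figures and guest names are assumptions for illustration; a real hypervisor does far more than bookkeeping.

```python
# Hypothetical host with fixed physical capacity, carved into guest machines.
HOST_CAPACITY = {"vcpus": 16, "memory_gb": 64}

class HostMachine:
    """The physical machine on which virtual (guest) machines are built."""
    def __init__(self, capacity: dict):
        self.capacity = dict(capacity)
        self.guests: dict[str, dict] = {}   # logical name -> allocated slice

    def create_guest(self, name: str, vcpus: int, memory_gb: int) -> bool:
        if vcpus > self.capacity["vcpus"] or memory_gb > self.capacity["memory_gb"]:
            return False  # not enough physical resources left
        self.capacity["vcpus"] -= vcpus
        self.capacity["memory_gb"] -= memory_gb
        self.guests[name] = {"vcpus": vcpus, "memory_gb": memory_gb}
        return True

if __name__ == "__main__":
    host = HostMachine(HOST_CAPACITY)
    print(host.create_guest("web-vm", 4, 8))   # True
    print(host.create_guest("db-vm", 8, 32))   # True
    print(host.create_guest("big-vm", 8, 32))  # False: capacity exhausted
```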

Work of Virtualization in Cloud Computing

Virtualization has a prominent impact on cloud computing. In cloud computing, users store data in the cloud, but with the help of virtualization, users have the extra benefit of sharing the infrastructure. Cloud vendors take care of the required physical resources, but these providers charge a large amount for their services, which impacts every user or organization. Virtualization helps users or organizations maintain the services a company requires through external (third-party) people, which helps reduce costs to the company. This is the way virtualization works in cloud computing.

Benefits of Virtualization

1) More flexible and efficient allocation of resources.

2) Enhanced development productivity.

3) Lower cost of IT infrastructure.

4) Remote access and rapid scalability.

5) High availability and disaster recovery.

6) Pay-per-use of the IT infrastructure on demand.

7) Enables running multiple operating systems.


Drawbacks of Virtualization

1) High Initial Investment: Clouds require a very high initial investment, though it is also true that they help reduce companies’ costs in the long run.

2) Learning New Infrastructure: As companies shift from servers to the cloud, they require highly skilled staff who can work with the cloud easily; for this, you have to hire new staff or train current staff.

3) Risk of Data: Hosting data on third-party resources can put the data at risk, as it has a higher chance of being attacked by a hacker or cracker.

Characteristics of Virtualization

1) Increased Security: The ability to control the execution of a guest program in a completely transparent manner opens new possibilities for delivering a secure, controlled execution environment. All the operations of the guest programs are generally performed against the virtual machine, which then translates and applies them to the host programs.

2) Managed Execution: In particular, sharing, aggregation, emulation, and isolation are the most relevant features.

3) Sharing: Virtualization allows the creation of separate computing environments within the same host.

4) Aggregation: It is possible to share physical resources among several guests, but virtualization also allows aggregation, which is the opposite process: a group of separate hosts can be tied together and represented as a single virtual host.
Types of Virtualization

1) Application Virtualization
2) Network Virtualization
3) Desktop Virtualization
4) Storage Virtualization
5) Server Virtualization
6) Data virtualization

1. Application Virtualization: Application virtualization helps a user gain remote access to an application from a server. The server stores all personal information and other characteristics of the application, yet the application can still be run on a local workstation through the internet. An example would be a user who needs to run two different versions of the same software. Technologies that use application virtualization are hosted applications and packaged applications.

2. Network Virtualization: The ability to run multiple virtual networks, each with a separate control and data plane, co-existing on top of one physical network. These networks can be managed by individual parties that may need to remain confidential from each other. Network virtualization provides a facility to create and provision virtual networks, logical switches, routers, firewalls, load balancers, Virtual Private Networks (VPNs), and workload security within days or even weeks.
3. Desktop Virtualization: Desktop virtualization allows the user’s OS to be remotely stored on a server in the data center. It allows the user to access their desktop virtually, from any location, on a different machine. Users who want specific operating systems other than Windows Server will need to have a virtual desktop. The main benefits of desktop virtualization are user mobility, portability, and easy management of software installation, updates, and patches.

4. Storage Virtualization: Storage virtualization is an array of servers managed by a virtual storage system. The servers aren’t aware of exactly where their data is stored and instead function more like worker bees in a hive. It allows storage from multiple sources to be managed and utilized as a single repository. Storage virtualization software maintains smooth operations, consistent performance, and a continuous suite of advanced functions despite changes, breakdowns, and differences in the underlying equipment.
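A minimal sketch of that idea: the application writes to one logical repository while a virtual storage layer decides which backing server actually holds the data. The backend names and the placement rule are invented for illustration.

```python
# Hypothetical pooled backends behind one logical storage namespace.
BACKENDS = ["storage-node-a", "storage-node-b", "storage-node-c"]

class VirtualStorage:
    """Presents a single repository; callers never see where data really lives."""
    def __init__(self, backends: list[str]):
        self.backends = backends
        self.placement: dict[str, str] = {}

    def write(self, key: str, value: bytes) -> None:
        backend = self.backends[hash(key) % len(self.backends)]  # simple placement rule
        self.placement[key] = backend
        print(f"stored {key!r} on {backend}")

    def locate(self, key: str) -> str:
        return self.placement[key]

if __name__ == "__main__":
    store = VirtualStorage(BACKENDS)
    store.write("orders/2024-12.csv", b"...")
    print(store.locate("orders/2024-12.csv"))
```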

5. Server Virtualization: This is a kind of virtualization in which the masking of server resources takes place. Here, the central (physical) server is divided into multiple virtual servers by changing the identity numbers and processors, so each system can run its own operating system in an isolated manner, while each sub-server knows the identity of the central server. It increases performance and reduces operating costs by deploying main server resources into sub-server resources. It is beneficial for virtual migration, reducing energy consumption, reducing infrastructural costs, and more.


6. Data Virtualization: This is the kind of virtualization in which data is collected from various sources and managed in a single place, without requiring knowledge of technical details such as how the data is collected, stored, and formatted. The data is then arranged logically so that its virtual view can be accessed remotely by interested people, stakeholders, and users through various cloud services. Many large companies provide such services, including Oracle, IBM, AtScale, and CData.
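As a sketch of data virtualization, the snippet below builds a single virtual view that pulls rows from several sources without the consumer knowing how each one stores or formats its data. The sources and fields are made up for illustration.

```python
# Hypothetical sources with different native systems, unified into one virtual view.
def crm_source() -> list[dict]:
    return [{"customer": "Acme", "revenue": 1200}]

def billing_source() -> list[dict]:
    return [{"customer": "Acme", "revenue": 300}, {"customer": "Globex", "revenue": 800}]

def virtual_view(sources) -> dict[str, int]:
    """Logically arranged view: total revenue per customer across all sources."""
    totals: dict[str, int] = {}
    for source in sources:
        for row in source():
            totals[row["customer"]] = totals.get(row["customer"], 0) + row["revenue"]
    return totals

if __name__ == "__main__":
    print(virtual_view([crm_source, billing_source]))  # {'Acme': 1500, 'Globex': 800}
```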
