Utility Computing, VM, Enterprise Grid Computing
The utility computing model is modeled on conventional public utilities: it aims to make IT resources as easily available as electricity, gas, water, and telephone services. For example, a consumer pays an electricity bill based on the number of units consumed, nothing more and nothing less. Utility computing works on the same principle, a pay-per-use model.
The service provider owns and manages the computing solutions and infrastructure; the client subscribes to them and is charged in a metered manner, with no upfront cost. The concept of utility computing is simple: it provides processing power when you need it, where you need it, and at a cost proportional to how much you use.
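To make the metering concrete, here is a minimal Python sketch of pay-per-use billing. The rates and usage figures are hypothetical, chosen only to illustrate the arithmetic:

```python
# Minimal sketch of metered, pay-per-use billing.
# All rates and usage figures are hypothetical.
RATES = {
    "compute_hours": 0.05,  # $ per VM-hour
    "storage_gb": 0.02,     # $ per GB-month
    "egress_gb": 0.09,      # $ per GB transferred out
}

def metered_bill(usage: dict) -> float:
    """Charge only for what was consumed -- no upfront cost."""
    return sum(RATES[item] * amount for item, amount in usage.items())

monthly_usage = {"compute_hours": 720, "storage_gb": 100, "egress_gb": 50}
print(f"Amount due: ${metered_bill(monthly_usage):.2f}")  # Amount due: $42.50
```

Just like an electricity meter, the bill is zero in a month with zero consumption.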
However, a major question remains: where and how does one get started with utility computing? Let's understand the implementation process of utility computing with the help of a few examples.
The initial steps involve assessing internal organizational needs and the combination of services and resources required. Utility computing hosting centers exist for a reason: they provide valuable, tightly integrated, fully customized utility computing solutions and resources tailored to clients' needs. However, none of this will matter if your organization is unclear about its actual objectives and needs.
Once the needs, objectives, and types of resources are determined, the final step in architecting a utility computing solution is mapping out the schedule: identifying when a specific resource might be needed and for how long. This allows the service provider to release unused resources early and improve the overall resource utilization strategy.
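The schedule-mapping step might be captured in a structure like the following sketch. The resource names and windows are hypothetical; the point is that an explicit end time lets the provider reclaim a resource the moment its window closes:

```python
# Hypothetical sketch: record when each resource is needed and for how
# long, so unused resources can be released early.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ResourceWindow:
    resource: str       # e.g. "batch-worker", "db-replica" (illustrative names)
    start: datetime
    duration: timedelta

schedule = [
    ResourceWindow("batch-worker", datetime(2024, 12, 1, 0, 0), timedelta(hours=6)),
    ResourceWindow("db-replica", datetime(2024, 12, 1, 8, 0), timedelta(days=30)),
]

def releasable(now: datetime) -> list[str]:
    """Resources whose window has closed and can be handed back."""
    return [w.resource for w in schedule if now >= w.start + w.duration]

print(releasable(datetime(2024, 12, 2)))  # ['batch-worker']
```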
Examples of utility computing
The utility computing model has become one of the most popular IT provisioning models. It offers many advantages to businesses, such as fewer internal IT management headaches and no requirement to buy software licenses. The arrival of public cloud utility solutions has been a game-changer here. The utility model aims to maximize the productive use of resources and minimize the costs that come with them.
1. Travel reservation applications
Let's assume you wish to travel to the Maldives and are looking to make flight and hotel bookings through your travel app. Due to the rise in demand, travel reservation applications deploy additional infrastructural support and virtual servers to manage the influx of travelers wanting to make their reservations. This way, travel applications bring extra resources on board when they need them and pay only for what they consume.
2. Online retailers
With Christmas and New Year around the corner, online retailers will witness
massive traffic jumps and endure extreme load on their servers. Let’s assume
you wish to do a little redecoration before the festive season hits, and you
turn to the Swedish Gods of DIY furnishing, aka Ikea. This is where utility computing comes in: online retailers deploy additional data storage space to manage the online surge and are charged on a rental basis.
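Both examples boil down to the same scale-out rule: rent extra capacity when demand spikes and hand it back afterwards. Here is a hedged sketch of such a rule; the thresholds and server counts are illustrative assumptions, not any provider's defaults:

```python
# Illustrative scale-out rule: demand determines how many virtual
# servers to rent; a baseline is kept during quiet periods.
BASELINE_SERVERS = 4
REQUESTS_PER_SERVER = 1000  # assumed comfortable load per server

def servers_needed(requests_per_minute: int) -> int:
    needed = -(-requests_per_minute // REQUESTS_PER_SERVER)  # ceiling division
    return max(BASELINE_SERVERS, needed)

# quiet day, holiday spike, post-season
for load in (2_500, 12_000, 3_000):
    print(f"{load} req/min -> {servers_needed(load)} servers")
# 2500 -> 4, 12000 -> 12, 3000 -> 4: extra capacity exists only while needed
```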
Benefits of utility computing
1. Flexibility
For years, enterprises have been looking for a model that provides flexibility and a bottom-up provisioning system. Utility computing provides the utmost flexibility in terms of availability of resources, on-demand usage, billing methods, and ease of accessing data anytime and anywhere. It also simplifies the process of handling peak needs. For instance, since you neither own the resources nor rent them for long periods, it becomes extremely easy to change the number of services, shrinking or expanding them based on changes in season, demand, audience, or new efficiencies.
2. Cost savings
Utility computing has created a storm in the business world primarily because
of its flexibility and better economics. Its pay-per-use method of billing lets
organizations pay for only those computing resources that they require. This
leads to maximum cost savings for organizations. From reduction in
operational costs, savings on capital expenses, and doing away with the initial
costs of acquiring new resources to significantly lowered IT costs, this model
is a complete package deal for enterprises across business verticals.
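A back-of-the-envelope comparison shows where the savings come from. Every number below is hypothetical; the point is that metered billing replaces a large fixed outlay with a cost that tracks actual usage:

```python
# Hypothetical year-one comparison: owning peak-sized hardware vs.
# paying per use. Numbers are illustrative only.
owned_capex = 50_000        # upfront servers sized for peak load
owned_annual_opex = 10_000  # power, cooling, administration
utility_rate = 0.05         # assumed $ per VM-hour
vm_hours_per_year = 20 * 8 * 365  # 20 VMs busy ~8 hours a day

print(f"Owned, year one:   ${owned_capex + owned_annual_opex:,}")
print(f"Utility, year one: ${utility_rate * vm_hours_per_year:,.0f}")
# Owned, year one:   $60,000
# Utility, year one: $2,920
```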
Best practices for utility computing
Before adopting any utility computing strategy, the first step is to assess the current workload and correctly identify your organization's needs. The next step is to plan and develop a computing strategy that aligns with the overall business strategy. A strong business case needs to be made that evaluates the crucial points. For example, a public utility computing system wouldn't be ideal for an organization that deals with highly confidential data; there, offloading only seasonal workloads is the better choice.
However, the true mark of a reliable utility service provider is their range of
security compliance and certifications. Ideally, this should be publicly
available, as in the case of leading providers such as Microsoft.
This clarity helps prevent security breaches caused by security needs falling through the cracks. Questions to ask include:
Does the utility service provider encrypt data, both in transit and at
rest?
Who has access to the data stored in the computing system from
the service provider’s end?
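One way to answer the first question yourself is client-side encryption: data is encrypted before it ever reaches the provider, so it is protected at rest regardless of what the provider does, while TLS/HTTPS covers it in transit. Below is a minimal sketch using the third-party Python `cryptography` package; the `upload` call is a hypothetical stand-in:

```python
# Sketch of client-side encryption at rest. The provider only ever
# sees ciphertext; the key stays with the organization.
from cryptography.fernet import Fernet

key = Fernet.generate_key()  # store this outside the provider's systems
cipher = Fernet(key)

record = b"customer card ending 4242"
ciphertext = cipher.encrypt(record)   # what the provider actually stores
# upload(ciphertext)                  # hypothetical call, made over HTTPS

assert cipher.decrypt(ciphertext) == record  # only the key holder can read it
```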
For any organization, its employees are the first line of defense in secure
utility computing. Their security practices and knowledge can either expose a
system to cyber-attacks or protect it. For this purpose, comprehensive
training for all employees is essential. This prevents hackers from accessing
credentials for utility computing tools, enables users to spot threats, and
equips them to respond appropriately. The threat landscape is constantly evolving, and without top-to-bottom visibility of all the systems interacting with the organization's data, taking stock of every security vulnerability and addressing it is next to impossible.
Owing to this, it is essential that the utility computing solution an organization adopts ensures visibility of the entire ecosystem. This aids the implementation of security policies across resources and helps identify risks effectively.
Security contracts and SLAs are the only assurances an organization can
completely rely on when it comes to utility computing or any computing
service. However, according to the McAfee Cloud Adoption and Risk Report 2019, 62% of service providers don't disclose that clients own their data. This leaves a legal grey area.
Assess the terms and conditions, annexes, and appendices to determine who owns the data and what happens if your organization decides to terminate the service.
Customer data safety is an asset for every organization and needs heightened protection, especially in the healthcare, finance, and retail sectors. These sectors deal with personal information and, as such, are required to adhere to strict compliance requirements, both geographic and industry-specific. Regulations such as HIPAA, PCI DSS, CCPA, SOC 2, and GDPR need to be considered.
Hence, before shortlisting any service provider, you should review specific
compliance needs and ensure that your chosen provider delivers on these
requirements.
10. Leverage automation
For any organization, deciphering the vast digital landscape and adopting a specific practice such as utility computing can be tricky. Automating routine tasks such as provisioning, monitoring, and policy checks reduces manual effort and human error, as the sketch below illustrates. Together, these best practices make the adoption of utility computing smoother, more secure, and more cost-effective.
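Here is a hedged sketch of one such scheduled audit job. The inventory, its attributes, and the thresholds are hypothetical placeholders for calls into a real provider's API:

```python
# Illustrative automated audit: enforce simple policies on a schedule
# instead of relying on manual review. All data here is hypothetical.
inventory = {
    "db-primary": {"encrypted": True, "idle_hours": 0},
    "batch-worker": {"encrypted": True, "idle_hours": 72},
    "legacy-share": {"encrypted": False, "idle_hours": 5},
}

def audit(resources: dict) -> list[str]:
    findings = []
    for name, attrs in resources.items():
        if not attrs["encrypted"]:
            findings.append(f"{name}: stores unencrypted data")
        if attrs["idle_hours"] > 48:
            findings.append(f"{name}: idle, candidate for release")
    return findings

for finding in audit(inventory):  # in practice, run hourly from a scheduler
    print(finding)
# batch-worker: idle, candidate for release
# legacy-share: stores unencrypted data
```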
Enterprise Grid Computing
Grid computing is when you have different grids (units of hardware) working for you. In an enterprise, it is always expected that if one system fails, another system should take over the process the first system was executing. Also, suppose you have a growing application: you keep adding hardware resources to meet its performance needs, but at some point the OS kernel becomes busy just managing the resources available to it. To overcome this, we use several different systems working together on a single application. This is called grid computing.
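The failover behavior described above can be sketched in a few lines. The node names and the simulated failure are illustrative; a real grid scheduler does this across physical machines:

```python
# Toy sketch of grid failover: if one node fails mid-task, the next
# node in the grid takes the work over.
import random

def run_on(node: str, task: str) -> str:
    if random.random() < 0.3:  # simulate a node going down
        raise RuntimeError(f"{node} is down")
    return f"{task} finished on {node}"

def submit(task: str, nodes: list[str]) -> str:
    for node in nodes:  # try each node until one succeeds
        try:
            return run_on(node, task)
        except RuntimeError as err:
            print(f"failover: {err}")
    raise RuntimeError("all nodes failed")

print(submit("render-frame-42", ["grid-a", "grid-b", "grid-c"]))
```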
Types of Virtualization
1) Application Virtualization
2) Network Virtualization
3) Desktop Virtualization
4) Storage Virtualization
5) Server Virtualization
6) Data Virtualization
2. Network Virtualization: The ability to run multiple virtual networks, each with a separate control and data plane, co-existing on top of one physical network. The virtual networks can be managed by individual parties whose traffic may need to stay confidential from one another. Network virtualization provides a facility to create and provision virtual networks, logical switches, routers, firewalls, load balancers, Virtual Private Networks (VPNs), and workload security within weeks or even days.
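The isolation idea can be shown with a toy model: all frames share one physical link, and a per-network tag keeps each virtual network's traffic invisible to the others (real VLAN tagging works on this principle). Names and payloads are illustrative:

```python
# Toy model of virtual networks sharing one physical link.
physical_link: list[tuple[int, str]] = []  # the one shared "wire"

def send(vnet: int, payload: str) -> None:
    physical_link.append((vnet, payload))  # tag every frame with its network

def receive(vnet: int) -> list[str]:
    # each virtual network sees only frames carrying its own tag
    return [p for tag, p in physical_link if tag == vnet]

send(10, "tenant-A: db sync")
send(20, "tenant-B: backup")
print(receive(10))  # ['tenant-A: db sync'] -- tenant B's frame stays invisible
```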
3. Desktop Virtualization: Desktop virtualization allows a user's OS to be stored remotely on a server in the data center, so the user can access their desktop virtually, from any location, on a different machine. Users who want a specific operating system other than Windows Server will need a virtual desktop. The main benefits of desktop virtualization are user mobility, portability, and easy management of software installation, updates, and patches.
6. Data Virtualization: This is the kind of virtualization in which data is collected from various sources and managed in a single place, without users needing to know technical details such as how the data is collected, stored, and formatted. The data is arranged logically so that its virtual view can be accessed remotely by interested people, stakeholders, and users through various cloud services. Many big companies provide such services, for example Oracle, IBM, AtScale, and CData.
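A hedged sketch of the idea: one logical view joins two sources with different formats, and callers never see how either source stores its data. Both sources here are in-memory stand-ins (say, for a SQL table and a CSV export):

```python
# Toy data virtualization: a single virtual view over two differently
# formatted sources. All data is illustrative.
crm_rows = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Luis"}]
billing_csv = "id,balance\n1,120.50\n2,0.00"

def virtual_view() -> list[dict]:
    """Join both sources on id; callers never touch the raw formats."""
    balances = {}
    for line in billing_csv.splitlines()[1:]:  # skip the CSV header
        id_, balance = line.split(",")
        balances[int(id_)] = float(balance)
    return [{**row, "balance": balances.get(row["id"])} for row in crm_rows]

for record in virtual_view():
    print(record)
# {'id': 1, 'name': 'Asha', 'balance': 120.5}
# {'id': 2, 'name': 'Luis', 'balance': 0.0}
```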