Unit 5
1) Multi-cloud and hybrid cloud
The use of multi-cloud and hybrid solutions is increasing. Many organizations, such as banks and insurance companies, use a hybrid cloud service that combines private and public clouds to store their data. Businesses now divide their workload among multiple cloud service providers, both to retain control over their data and resources and to exploit the strengths of each provider. Multi-cloud minimizes potential risks and failure points and is cost-effective. With multi-cloud, you can choose a particular service from a particular cloud service provider that meets your requirements instead of deploying your entire application on one cloud. This also pushes cloud service providers to introduce new services.
2) Low-code and no-code cloud solutions
Gone are the days when users needed deep technical knowledge and hundreds of lines of code to create applications and solve real-world problems. With low-code and no-code cloud solutions, businesses can create applications and make use of AI and its subdomains without writing much code. These solutions help in developing websites, apps, services, etc. with little technical knowledge, which reduces the time and cost of building them, increases product development speed, and results in fewer errors.
3) Edge computing
Edge computing covers data storage, processing, and analytics performed geographically closer to the source; that is, computation and storage are brought nearer to the sensors and devices that generate the data. Processing data close to where it is generated enables processing at greater speeds and volumes, leading to more action-led results in real time. It provides benefits such as reduced latency, enhanced efficiency, increased privacy and security, and a high rate of data transmission. It works in real time and processes time-sensitive data.
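The idea of processing near the source can be sketched in Python: an edge node summarizes raw sensor readings locally and forwards only a compact summary (plus any alarm values) to the cloud, instead of the full raw stream. The threshold and field names here are illustrative, not from any particular platform.

```python
import statistics

def edge_aggregate(samples, threshold=50.0):
    """Process raw sensor samples locally at the edge node.

    Only a compact summary (and any out-of-range readings) is
    forwarded upstream, cutting both latency and bandwidth.
    """
    alarms = [s for s in samples if s > threshold]
    return {
        "count": len(samples),
        "mean": statistics.mean(samples),
        "max": max(samples),
        "alarms": alarms,     # readings needing immediate action
    }

readings = [21.5, 22.0, 21.8, 63.2, 22.1]   # e.g. temperature samples
payload = edge_aggregate(readings)
print(payload["count"], payload["alarms"])  # 5 [63.2]
```

Only `payload` crosses the network; the five raw samples never leave the device.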
4) IoT
The Internet of Things (IoT) is a trend that grows more popular by the day. IoT involves many sensors that generate huge amounts of data, which are stored on cloud servers. IoT uses sensors and actuators and analyzes the collected data to yield results that support business decisions. It involves connectivity among computers, networks, and servers, and it can collect data remotely and communicate with devices.
IoT gathers data from various sensors and devices and acts as an intermediary between remote systems and smart device management. Smart connectivity plays a major role in making IoT a trend in cloud computing.
5) AI and ML
One of the most trending technologies closely tied to cloud computing is Artificial Intelligence and Machine Learning. The cloud makes these technologies cost-effective, since they require high computational power and large storage for data collection and training. Major trends expected to grow in this sector in the coming years are self-automation, self-learning, personalized cloud, high data security, and privacy.
6) Serverless architecture/computing
A serverless architecture scales automatically with demand. It offers many advantages, such as no need for system administration, low cost and liability, easy management of operations, and an enhanced user experience.
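A minimal sketch of the serverless model, using the AWS Lambda-style `handler(event, context)` convention: the developer supplies only this function, and the platform provisions, runs and scales the runtime on demand. The event fields below are made up for illustration.

```python
import json

def handler(event, context=None):
    """AWS Lambda-style entry point. The cloud platform invokes
    this per request; there is no server for us to administer."""
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }

# Local invocation for testing (in production the platform calls it):
resp = handler({"name": "cloud"})
print(resp["statusCode"])  # 200
```

Billing in this model follows invocations rather than provisioned machines, which is where the "low cost" advantage comes from.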
7) DevSecOps
DevSecOps extends DevOps by integrating security practices throughout the development and operations pipeline, rather than treating security as a final step.

MOBILE CLOUD COMPUTING
Cloud computing delivers programs to mobile devices via mobile cloud computing. These mobile apps can be deployed remotely with the help of development tools that offer speed and flexibility. Cloud services can be used to build or update mobile cloud apps easily, and the apps can be delivered to various devices, each with its own operating system, computational capabilities, and data storage requirements. As a result, users can access apps that would otherwise be unavailable. The underlying concept of Mobile Cloud Computing is to execute mobile applications over several devices at the same time.
Instant Development
Cloud companies today develop mobile applications that help customers on a regular basis. These applications are backed by upgrades that improve their performance, and this continuous improvement has led to rapid development in Mobile Cloud Computing.
Flexibility
Mobile applications are built for maximum reach and flexibility. Several development approaches are followed today, and all of them are supported by Mobile Cloud Computing. In Mobile Cloud Computing, customers can choose the services they need to expand their businesses, making the model more flexible.
Secure
Mobile Cloud Computing is considered reliable because all data is stored on the cloud, providing enhanced data security. Cloud-backed data can be retrieved at any time over a secure channel. These mobile cloud-based applications are protected by a password, so if the mobile device is lost or damaged, the data in the cloud remains secure and free from risk.
Advantages of Mobile Cloud Computing
1) Flexibility
Mobile Cloud Computing is highly flexible, as it allows access to data from any location at any time. All the customer requires is a device with an active Internet connection.
2) Multi-Platform Access
Cloud computing applications developed by companies can be accessed across multiple mobile platforms such as iOS and Android. The cloud is easy to access and can be modified irrespective of the platform the user is on.
3) Economical Option
With mobile cloud computing, the user can eliminate hardware costs, making it a cost-effective alternative to use and maintain. Mobile Cloud Computing costs very little, and the end consumer pays only for what is used.
4) Backup and Recovery
Data stored in a mobile cloud application can be backed up with ease and retrieved whenever needed. A cloud disaster recovery plan involves storing and maintaining replicas of the data at different locations while maintaining top-notch security.
Challenges of Mobile Cloud Computing
1. Security and Privacy: It is more difficult to identify and manage threats on mobile devices than on desktop devices, because on a wireless network there is a greater chance of information going missing from the network.
2. Service Availability: Users often face problems such as network breakdown, traffic congestion, and lack of coverage. Sometimes customers get a low-frequency signal, which affects access speed and storage.
3. Limited Energy Source: Mobile devices consume more energy and are less powerful than desktops. Mobile cloud computing increases the battery usage of mobile devices, which becomes an important issue; devices need long-lasting batteries to access applications and perform other operations. When the size of the offloaded code is small, offloading consumes more energy than local processing.
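The offloading trade-off in point 3 can be sketched with a toy energy model: offload only when transmitting the code and data costs less energy than computing locally. The per-cycle and per-byte constants below are illustrative, not measured values.

```python
def should_offload(cycles, data_bytes,
                   local_joules_per_cycle=1e-9,
                   radio_joules_per_byte=5e-7):
    """Toy energy model for computation offloading.

    Compare the energy to run the task on the device against the
    energy the radio spends shipping the task to the cloud.
    (Constants are illustrative assumptions.)
    """
    local_energy = cycles * local_joules_per_cycle
    transfer_energy = data_bytes * radio_joules_per_byte
    return transfer_energy < local_energy

# Small task: transmitting it costs more than just running it locally.
print(should_offload(cycles=10_000, data_bytes=2_000))         # False
# Compute-heavy task with little data: offloading pays off.
print(should_offload(cycles=5_000_000_000, data_bytes=2_000))  # True
```

This is why, as the text notes, offloading small pieces of code can actually waste battery.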
Applications of Mobile Cloud Computing-
Many mobile cloud applications store data for mobile virtual commerce such as e-banking and e-shopping. Common examples include mobile email, mobile sensing, mobile healthcare, mobile gaming, mobile social networking, mobile commerce for banking, mobile learning, etc.
1. Mobile Email: We use this daily to check mail on a mobile device; it is the most familiar example.
2. Mobile Sensing: Nowadays most of us use smartwatches, with an application on the mobile to connect with the smartwatch and track the details.
3. Mobile Healthcare: It plays a major role in healthcare, helping access all patient records, track them, and raise the required alarms.
4. Mobile Gaming: In mobile gaming, people playing the games will interact only with the
screen interface of their device.
5. Mobile Social Networking: We use many social networking applications, mainly to connect with different people and upload videos. MCC not only provides data storage but also takes care of integration and security for the data.
6. Mobile Commerce: We use mobile commerce for many things, such as e-shopping, e-banking, and e-advertising. MCC is used here mainly for its scalable processing power and because it takes care of security at payment time.
7. Multimedia Sharing: As the name suggests, this helps share data from one mobile to another. Mobile users can share all types of data, and MCC supports both the sharing and its security.
8. Mobile Learning: Mobile learning is a growing trend on smartphones. Many schools now offer learning through mobile: teachers upload material to the applications, and students use it to prepare for exams. Students can also save all the material, videos, and pictures. The data is scalable, and the owners charge according to the students' or teachers' usage of the application.
CLOUD AUTOMATION
Cloud automation is a broad term that refers to processes and tools that reduce or eliminate
manual efforts used to provision and manage cloud computing workloads and services.
Organizations can apply cloud automation to private, public and hybrid cloud environments.
Resource allocation. Autoscaling -- the ability to scale up and down the use of compute,
memory or networking resources to match demand -- is a core tenet of cloud computing. It
provides elasticity in resource usage and enables the pay-as-you-go cloud cost model.
Tagging. Assets can be tagged automatically based on specific criteria, context and
conditions of operation.
Security. Cloud environments can be set up with automated security controls that enable or
restrict access to apps or data, and scan for vulnerabilities and unusual performance levels.
Logging and monitoring. Cloud tools and functions can be set up to log all activity involving
services and workloads in an environment. Monitoring filters can be set up to look for
anomalies or unexpected events.
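The autoscaling mentioned under resource allocation can be sketched as a simple target-tracking rule: size the fleet so that average CPU utilization moves back toward a target value. This is the general idea only, not any provider's exact algorithm; the target and limits are illustrative.

```python
import math

def autoscale(current_instances, cpu_utilization,
              target=0.50, min_n=1, max_n=10):
    """Target-tracking autoscaling sketch.

    If the fleet averages cpu_utilization, the instance count that
    would bring it back to `target` is current * utilization / target,
    clamped to the allowed range. (Illustrative rule only.)
    """
    desired = math.ceil(current_instances * cpu_utilization / target)
    return max(min_n, min(max_n, desired))

print(autoscale(4, 0.75))  # overloaded -> scale out to 6
print(autoscale(4, 0.20))  # underused  -> scale in to 2
```

The same rule drives both scale-up and scale-down, which is what gives the pay-as-you-go model its elasticity.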
When implemented properly, cloud automation offers many benefits, such as the following:
it is faster, more secure and more scalable than manually performing tasks;
it leads to fewer errors, as organizations can construct more predictable and reliable
workflows; and
it contributes directly to better IT and corporate governance.
Cloud automation also enables IT teams, freed from repetitive and manual administrative tasks,
to focus on higher-level work that more closely aligns with an organization's business needs,
such as integrating higher-level cloud services or developing new product features.
COMETCLOUD
CometCloud provides:
1. Infrastructure services for dynamic federation and coordination to enable on-demand scale-up, scale-down and scale-out.
2. Programming support to enable a range of programming models and services for autonomic
monitoring and management of the infrastructure and applications.
The overarching goal of CometCloud is to realize a virtual computational cloud infrastructure that integrates local computational environments and public cloud services on demand, and to provide abstractions and mechanisms that support a range of programming paradigms and real-world applications on such an infrastructure. Specifically, CometCloud provides programming abstractions and underlying mechanisms and services.
The service layer provides a range of services to support autonomics at the programming and application level.
The programming layer provides the basic framework for application development and
management. It supports a range of paradigms including the master/worker/BOT, workflows
and MapReduce/Hadoop.
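The master/worker (bag-of-tasks) paradigm that the programming layer supports can be sketched with Python's standard library. This shows the paradigm only, on threads within one process, not CometCloud's actual API or its distributed tuple space.

```python
import queue
import threading

# The "bag of tasks" the master fills, and the results channel.
tasks = queue.Queue()
results = queue.Queue()

def worker():
    """Pull tasks until a poison pill (None) arrives."""
    while True:
        n = tasks.get()
        if n is None:
            break
        results.put(n * n)   # the "task": square a number

workers = [threading.Thread(target=worker) for _ in range(3)]
for w in workers:
    w.start()

for n in range(1, 6):        # master enqueues the bag of tasks
    tasks.put(n)
for _ in workers:            # one poison pill per worker
    tasks.put(None)
for w in workers:
    w.join()

print(sorted(results.queue))  # [1, 4, 9, 16, 25]
```

Workers are interchangeable and pull work when free, which is exactly what lets such a system scale out (add workers) or scale in (retire them) on demand.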
Multimedia Cloud Computing
The Internet is having a significant impact on media-related industries, which use it as a medium to deliver their content to end-users.
Twitter users tweet an average of 55 million tweets a day, including web links and photo albums. Web pages and other multimedia content are delivered through content delivery network (CDN) technologies.
A different variant of CDN technology appeared in the mid-2000s to support the streaming of hundreds of high-definition channels to paying customers. These CDNs had to meet more stringent Quality of Service (QoS) requirements to support users' experience of high-definition video.
A more recent variant of video CDNs involves caching video content in cloud storage and distributing it using third-party network services designed to meet the QoS requirements of caching and streaming high-definition video. For example, Netflix's video CDN has been built on top of Amazon AWS. CloudFront is Amazon's own CDN; it uses Amazon AWS and provides streaming video services to devices such as Microsoft Xboxes. While cloud-based CDNs have made remarkable progress in the past five years, they are still limited in the following aspects:
CDN service providers either own all the services they use to run their CDN services or they
outsource this to a single cloud provider. A specialized legal and technical relationship is
required to make the CDN work in the latter case.
Video CDNs are not designed to manage content (e.g., find and play high definition movies).
This is typically done by CDN applications. For example, CDNs do not provide services that
allow an individual to create a streaming music video service combining music videos from
an existing content source on the Internet (e.g., YouTube), his/her personal collection, and
from live performances he/she attends using his/her smart phone to capture such content.
This can only be done by an application managing where and when the CDN will deliver the
video component of his/her music program.
CDNs are designed for streaming staged content but do not perform well in situations where
content is produced dynamically. This is typically the case when content is produced,
managed and consumed in collaborative activities. For example, an art teacher may find and
discuss movies from different film archives, the selected movies may then be edited by
students. Parts of them may be used in producing new movies that can be sent to the
students’ friends for comments and suggestions. Current CDNs do not support such
collaborative activities that involve dynamic content creation.
IPTV (Internet Protocol Television)
In general, IPTV sends only the program requested by the viewer; a new stream is transmitted to the viewer when the channel is changed. Traditional TV, however, broadcasts all channels simultaneously.
1. VOD: Video on demand (VOD) is an option available to the users of IPTV. Each
user is given the option to choose from a catalog of videos and watch them as
many times as required. This feature uses unicast transmission, whereas normal
TV broadcasts use multicast transmission. Real Time Streaming Protocol is used
for VOD.
2. DVR: IPTV allows users to watch TV shows broadcast in the past using digital
video recorder (DVR), which is also known as time shifted programming.
Providers of IPTV allow users to watch recorded shows without DVR devices.
There is a live DVR system at the provider’s end, making DVR more cost
effective and efficient. Users can watch replays or start a TV program over from
an interactive menu.
3. Live Television: IPTV allows users to watch live transmissions with minimal latency. It provides live television broadcasts, with or without interactivity, much like traditional TV broadcasts. The protocol used for live television is Internet Group Management Protocol (IGMP) version 2.
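The IGMP join that underlies live IPTV can be seen at the socket level: a receiver asks the kernel to join a multicast group, and the kernel emits the IGMP membership report. The group address and port below are illustrative (RTP video streams commonly use ports such as 5004), not any real channel.

```python
import socket
import struct

GROUP = "239.1.1.1"   # hypothetical multicast group for one channel
PORT = 5004

def membership_request(group, interface="0.0.0.0"):
    """Pack the ip_mreq structure that asks the kernel (via IGMP)
    to join a multicast group on the given interface."""
    return struct.pack("4s4s",
                       socket.inet_aton(group),
                       socket.inet_aton(interface))

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))
try:
    # Joining emits an IGMP membership report on the local network;
    # sock.recvfrom(2048) would then receive the channel's packets.
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP,
                    membership_request(GROUP))
except OSError:
    pass  # hosts without a multicast-capable interface refuse the join
sock.close()
```

Because every viewer of a channel joins the same group, the network duplicates the stream only where paths diverge, which is why live TV can use multicast while VOD needs a unicast stream per viewer.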
The biggest limitation is that IPTV broadcasts require a certain amount of consistent bandwidth to stream the right number of moving-picture frames. For providers with a large IPTV customer base, customers may therefore experience packet loss and transmission delays.
GREEN CLOUD COMPUTING
We know that cloud-based computing can reduce IT capital costs and labour costs, enhance productivity, and be remarkably efficient. One analysis shows that an organization that switched to the cloud saved around 68–87% of the energy for its office computing and reduced its carbon emissions.
The value of cloud computing services has continued to increase in spite of a decline in economic activity. Worldwide cloud computing revenue grew at a compound annual growth rate of 28.8%, with the market increasing from $46.0 billion to $210.3 billion.
Growth in cloud computing has consequences for GreenHouse Gas (GHG) emissions and sustainability. Clouds are better utilized and less expensive to operate than traditional data centres. Moreover, only large organizations, both commercial and governmental, have the capital and expertise to achieve a similar level of efficiency at a lower cost. Because of this, much of the work done in internal data centres can be outsourced to the cloud in the coming years, resulting in reductions in energy consumption, energy expenses and emissions from data centre operations.
A research report analyzed the energy efficiency benefits of cloud computing, which
includes an assessment for SaaS (Software as a Service), PaaS (Platform as a Service) and IaaS
(Infrastructure as a Service) markets. The study examines the drivers and technical
developments related to cloud computing. Market forecasts include a quantification of energy
savings and reduction possibilities under a cloud computing scenario. Key issues addressed are:
● Cost-wise advantage of public cloud computing providers over traditional data centres.
● The kind of ROI that cloud computing delivers from an energy-efficiency perspective.
● Impact of using cloud computing on carbon emission from the data centre operations.
1. First, by migrating to the cloud, industries can achieve significant energy savings and
reduced pollution, in addition to savings from reduced server purchases and maintenance.
2. Second, the reduction in energy consumption was larger than the reduced number of servers alone would explain. This was due to two factors: server utilization is lower, so less power is consumed per server, and grouping the servers into subsets based on PUE (power usage effectiveness) reduces overall energy consumption.
3. Third, the results do not reflect the energy consumed by the client devices.
There are two models for saving energy in office computing: the standard model and the cloud-based model enabled by Google Apps. Migration to Google Apps affects energy consumption in three ways.
1. Reduces direct energy for servers by 70–90%: Far fewer servers are required, and Google's servers run fully loaded and more efficiently. An organization that hosts its own IT services must build in redundancy for safety and extra servers to handle peak demand, which results in more servers and server utilization of 10% or less.
2. Reduces energy for server cooling by 70–90%: The more energy a server consumes, the more heat it produces, making the AC cooling systems work harder than usual. About 1.5 watts of cooling is required for each watt of direct power consumed by the server. In addition, large corporate data centres indirectly consume about 0.5 watts of power for each watt of direct power consumption.
3. Increases energy by 2–3% through the use of Google servers and heavier network traffic: The impressive savings from points 1 and 2 are not achieved for free. As a result of cloud-based services, some energy consumption is added by Google's servers and by heavier traffic on the Internet.
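The cooling and overhead ratios quoted above can be turned into a quick calculation of how a cut in direct server power multiplies into facility-level savings. The 10 kW starting figure and 80% reduction are illustrative numbers, not from the study.

```python
def total_power(direct_watts, cooling_ratio=1.5, overhead_ratio=0.5):
    """Facility power draw implied by the ratios in the text:
    each watt of direct server power drags along ~1.5 W of cooling
    and ~0.5 W of other indirect consumption."""
    return direct_watts * (1 + cooling_ratio + overhead_ratio)

before = total_power(10_000)        # 10 kW server room -> 30 kW facility
after = total_power(10_000 * 0.2)   # 80% fewer server-watts after migration
print(before, after)                # 30000.0 6000.0

savings = 1 - after / before
print(f"{savings:.0%}")             # 80%
```

Because cooling and overhead scale with direct power, a cut in server energy reduces facility energy by the same fraction, which is why points 1 and 2 quote the same 70–90% range.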
JUNGLE COMPUTING
Jungle Computing is a distributed computing paradigm that emerged from the plethora of available distributed resources. A Jungle Computing System allows the user to use all computing resources available in this environment, including clusters, clouds, grids, desktop grids, supercomputers, stand-alone machines and mobile devices.
There are many reasons to use Jungle Computing Systems. Running an application on a single system may not be possible because it needs more computing power than is available, and a single system may not meet all the requirements of an application when different parts of it have different computational requirements.
From an abstract view, all resources in a Jungle Computing System are equal in some way: each contains some amount of processing power, memory, storage, etc. End-users need not consider whether a resource is located in a remote cloud or down the hall in a cluster, as long as the compute resource runs their application effectively. A Jungle Computing System is nevertheless heterogeneous, because the resources differ in processor architecture, memory capacity and performance. In the absence of central administration of these unrelated systems, software such as libraries and compilers may differ as well.
For example, where a stand-alone system is permanently available, grid resources have to be reserved, whereas a cloud requires a credit card to get access. The middleware used to access each resource also differs, each with its own interface.
The heterogeneity of Jungle Computing Systems makes it hard to run applications across several resources. The application may have to be re-compiled or even partially re-written for each resource to handle the differences in available software and hardware, and different middleware client software is required for each resource's middleware interface. Jungle Computing Systems also lack connectivity between resources, which further limits their use.
The main aim of introducing Grid Computing over a decade ago was to provide efficient and transparent wall-socket computing over a distributed set of resources. Since then, several other distributed computing paradigms have been introduced, such as peer-to-peer computing, volunteer computing and, more recently, Cloud Computing. These paradigms share most of the goals of grid computing: to supply end-users with controlled, distributed resources with as little effort as possible. Together, these novel paradigms have produced a diverse bundle of resources for innovation in computing research, incorporating stand-alone systems, clusters, grids, clouds, etc.
It is very difficult for scientists to program and use clusters, grids and clouds equipped with multi-core processors and many-core 'add-ons'. The programming and efficient use of many-cores is known to be hard, but this is not the only problem: programmers must also exploit the potential for parallelism at all levels of granularity. The problem is now even more severe, because a single high-performance computing environment is often insufficient for the speed and scalability demanded in most scientific research domains. The need to access multiple platforms concurrently from within a single application often arises from the impossibility of reserving a sufficient number of compute nodes at once in a single multi-user system. The heterogeneous nature of the available platforms is another issue.
Distributed cloud computing vs edge computing-

Parameters               | Edge Computing       | Distributed Cloud Computing
Computing Capability     | Low                  | High
Data Processing Location | In the device itself | At servers
Response Time            | Low                  | High
Containers: Docker and Kubernetes
Containers package application software with its dependencies in order to abstract it from the infrastructure it runs on. Containers offer a logical packaging mechanism in which applications are abstracted from the environment in which they actually run. This decoupling allows container-based applications to be deployed easily and consistently, regardless of whether the target environment is a private data centre, the public cloud, or even a developer's personal laptop.
Kubernetes Features
Runs everywhere: It is an open-source tool that gives you the freedom to take advantage of on-premises, public and hybrid cloud infrastructure, letting you move your workload anywhere you want.
Automation: For instance, Kubernetes will decide for you the most suitable host on which to launch a container.
Interaction: Kubernetes can manage multiple clusters at the same time, and it allows not only horizontal but also vertical scaling.
Kubernetes Advantages
Automatic container scheduling: Kubernetes may reschedule a container from one node to another to increase resource utilization. This means you get more work out of the same number of machines, which saves money.
Service discovery: When you have a bunch of services that need to communicate with each
other it’s critical that they are able to find each other first. This is especially true because
containers are automatically scheduled and may potentially get moved around. Thankfully,
Kubernetes makes it easy for containers to communicate with each other.
Self-Healing: Kubernetes automatically monitors containers and reschedules them if they crash
or are terminated when they shouldn’t. Kubernetes will also reschedule containers in the event
that the node that they’re living on fails.
Rolling Upgrades: Kubernetes can perform rolling updates, in which old containers are judiciously swapped out for a new version of the same containers, all without disrupting the service provided by the running application.
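The self-healing behaviour described above is essentially a control loop that compares desired state with observed state and reconciles the difference. The toy sketch below illustrates that idea only; it is not how the kubelet or Kubernetes controllers are actually written, and the pod names are made up.

```python
desired_replicas = 3
observed = {"pod-1": "Running", "pod-2": "Crashed", "pod-3": "Running"}

def reconcile(state, desired):
    """One pass of the control loop: discard failed pods, then
    schedule replacements until observed count == desired count."""
    state = {name: s for name, s in state.items() if s == "Running"}
    i = 0
    while len(state) < desired:
        i += 1
        state[f"pod-new-{i}"] = "Running"   # reschedule a replacement
    return state

observed = reconcile(observed, desired_replicas)
print(sorted(observed))  # ['pod-1', 'pod-3', 'pod-new-1']
```

Running such a loop continuously is what makes crash recovery and node-failure rescheduling automatic rather than an operator's job.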
Kubernetes Disadvantages
Steep learning curve: Kubernetes is not an easy platform to learn, even for the most
experienced developers and DevOps engineers.
Install & configure: Kubernetes consists of multiple components that must be configured and installed separately to initialize the cluster. If you install Kubernetes manually, you must also configure security yourself, which includes creating a certificate authority and issuing certificates.
No high availability by default: Kubernetes does not provide a high-availability mode by default; to create a fault-tolerant cluster you have to configure HA for your etcd cluster manually.
Compatibility issues: Early versions of Kubernetes were not compatible with the existing Docker CLI and Compose tools, so using Docker together with Kubernetes required extra effort. Migrating an application to a stateless design in order to containerize it also requires significant effort.
Docker
Docker is a platform used to containerize your software: you can easily build your application, package it with the dependencies it requires into a container, and then ship those containers to run on other machines. Docker simplifies the DevOps methodology by allowing developers to create templates called images, from which you can create lightweight, virtual-machine-like environments called containers.
Docker is making things easier for software industries giving them the capabilities to automate
the infrastructure, isolate the application, maintain consistency, and improve resource
utilization.
Easy configuration: This is one of the key features of Docker. You can deploy your code in less time and with less effort, since Docker can be used in a wide variety of environments. The infrastructure requirements are no longer tied to the application's environment, making system configuration easier and faster.
You can use Swarm: Swarm is a clustering and scheduling tool for Docker containers. It uses the Docker API as a front end, which lets us use various tools to control it, and it lets us manage a cluster of Docker hosts as a single virtual host. It is a self-organizing group of engines that enables pluggable back ends.
Manages security: Docker allows us to save secrets in the swarm itself and then choose which services get access to which secrets. It includes important engine commands such as secret inspection and secret creation.
Services: A service is a list of tasks that lets us specify the state of a container inside a cluster. Each task represents one instance of a container that should be running, and Swarm schedules them across the nodes.
Docker Advantages
Build the app only once: An application inside a container can run on any system that has Docker installed, so there is no need to build and configure the app multiple times for different platforms.
More sleep and less worry: With Docker, you test your application inside a container and ship it inside a container. This means the environment in which you test is identical to the one in which the app will run in production.
Portability: Docker containers can run on any platform. It can run on any local system, Amazon
EC2, Google Cloud, Virtual Box, etc.
Version control: Like git, Docker has a built version control system. Docker containers work just
like GIT repositories, allowing you to commit changes to your Docker images and version
control them.
Docker Disadvantages
Missing features: There are tons of features still under development, such as container self-registration, self-inspection, copying files from the host to a container, and many more.
Data in the container: When a container goes down, it needs a backup and recovery strategy; although several solutions exist for this, they are not automated or very scalable yet.
Graphical applications: Docker was designed as a solution for deploying server applications that do not require a graphical interface, although there are some creative strategies, such as X11 forwarding, that you can use to run GUI apps inside a container.
Benefits are limited for some apps: Generally, only applications designed as a discrete set of microservices stand to gain the most from containers; otherwise, Docker's main real benefit is that it simplifies application delivery by providing an easy packaging mechanism.
DEVOPS
The word DevOps is a combination of the terms development and operations, meant to
represent a collaborative or shared approach to the tasks performed by a company's
application development and IT operations teams.
In its broadest meaning, DevOps is a philosophy that promotes better communication and
collaboration between these teams -- and others -- in an organization. In its most narrow
interpretation, DevOps describes the adoption of iterative software development, automation
and programmable infrastructure deployment and maintenance. The term also covers culture
changes, such as building trust and cohesion between developers and systems administrators
and aligning technological projects to business requirements. DevOps can change the software
delivery chain, services, job roles, IT tools and best practices.
DevOps methods and tools include the following:
● Continuous integration and continuous delivery (CI/CD) or continuous deployment tools, with an emphasis on task automation.
● Systems and tools that support DevOps adoption, including software development, real-time monitoring, incident management, resource provisioning, configuration management and collaboration platforms.
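The CI/CD idea, in which each stage gates the next so a failing test automatically blocks a release, can be sketched as a tiny pipeline. The stage names and artifact below are illustrative, not any CI system's API.

```python
def build():
    """Compile and package the application."""
    return {"artifact": "app-1.0.tar.gz"}

def run_tests(artifact):
    """Run the automated test suite against the built artifact
    (always passes in this sketch)."""
    return True

def deploy(artifact):
    """Push the artifact to the target environment."""
    return f"deployed {artifact['artifact']}"

def pipeline():
    # Each stage only runs if the previous one succeeded: this
    # gating is what makes delivery continuous *and* safe.
    artifact = build()
    if not run_tests(artifact):
        return "release blocked: tests failed"
    return deploy(artifact)

print(pipeline())  # deployed app-1.0.tar.gz
```

Real CI/CD systems express the same stage graph declaratively and run it on every commit, which is the task automation emphasized above.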
A DevOps approach is one of many techniques IT staff use to execute IT projects that meet
business needs. DevOps can coexist with Agile and other continuous software development
paradigms; IT service management frameworks, such as ITIL; project management directives,
such as Lean and Six Sigma; and other strategies.
Some IT professionals believe that the simple combination of Dev and Ops isn't enough, and the
term DevOps should include business (BizDevOps), security (DevSecOps) or other areas.
At its core, DevOps and DevOps practices are shown to improve software quality and
development project outcomes for the enterprise. Such improvements take several forms,
including the following:
Product quality. The cyclical, iterative nature of DevOps ensures that products are tested
continuously as existing defects are remediated and new issues are identified. Much of this is
handled before each release, resulting in frequent releases that enable DevOps to deliver
software with fewer bugs and better availability compared to software created with traditional
paradigms.
Deployment management. DevOps integrates software development and IT operations tasks,
often enabling developers to provision, deploy and manage each software release with little, if
any, intervention from IT. This frees IT staff for more strategic tasks. Deployment can take place
in local infrastructure or public cloud resources, depending on the project's unique goals.
The concept of Home Automation aims to bring the control of your everyday home
electrical appliances to the tip of your finger, giving users affordable lighting solutions and
better energy conservation with optimum use of energy. Beyond lighting solutions, the
concept also extends to overall control of your home security, a centralised home
entertainment system and much more. The Internet of Things (commonly referred to as IoT)
based Home Automation system, as the name suggests, aims to control all the devices of your
smart home through internet protocols or cloud based computing.
An IoT based Home Automation system offers a lot of flexibility over wired systems, as it
comes with various advantages: ease of use, ease of installation, no complexity of running
wires or risk of loose electrical connections, easy fault detection and triggering, and above
all, easy mobility.
Basic Setup-
An IoT based Home Automation system consists of servers and sensors. The servers are
remote servers located on the Internet which help you manage and process the data without the
need for personal computers. These internet based servers can be configured to control and
monitor multiple sensors installed at the desired location.
Controller: The Brain of Your System
The main controller, or hub, is the most essential part of your Home Automation system,
irrespective of whether you connect a single sensor or multiple sensors in your home. The
hub is also referred to as a gateway and is connected to your home router through an
Ethernet cable. All the IoT based sensors transmit data to or receive commands through the
centralised hub. The hub in turn receives the input or communicates the output to the cloud
network located over the internet.
Due to this architecture, it is possible to communicate with the centralised hub even
from distant locations through your smartphone. All you need is a reliable internet
connection at the hub location and a data plan on your smartphone to connect to the cloud
network.
Most of the smart home controllers available in the market from several manufacturers cater to
all three widely used protocols of wireless communication for Home Automation: ZigBee, Z-
Wave and Wi-Fi.
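The hub's role described above can be sketched in a few lines. This is a simplified model, not a real ZigBee, Z-Wave or Wi-Fi implementation: class and method names are illustrative, and a plain list stands in for the cloud backend. It shows the routing rule that every sensor reading and every remote command passes through the central hub.

```python
# Minimal sketch of the hub/gateway role: sensors report through the
# hub, which relays readings to the cloud, and remote commands arrive
# via the cloud before reaching a device. Names are hypothetical.
class Hub:
    def __init__(self):
        self.cloud_log = []   # stands in for the cloud network
        self.devices = {}     # device name -> current state

    def report(self, sensor, value):
        """A sensor pushes a reading; the hub relays it to the cloud."""
        self.cloud_log.append((sensor, value))

    def command(self, device, state):
        """A command from the smartphone app, routed via the cloud."""
        self.devices[device] = state
        self.cloud_log.append((device, state))

hub = Hub()
hub.report("temperature", 22.5)        # a sensor reading flows upward
hub.command("living_room_light", "on") # a remote command flows downward
print(hub.devices)
```

Because all traffic funnels through one gateway, a single reliable internet connection at the hub is enough to reach every device in the home.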
IoT in the Automotive Industry-
Clear picture of the manufacturing floor. Industrial IoT solutions are able to collect massive
amounts of data at production sites. Real-time processing and analysis of such data helps car
manufacturers improve the understanding of and optimize the entire production process,
introduce higher safety standards, and reduce losses.
Enhanced in-vehicle experience. Users can now enjoy a range of in-vehicle infotainment
systems, navigation solutions, telematics, driver assistance systems, etc. that add comfort and
efficiency to car ownership.
Safer roads for drivers and pedestrians. IoT-enabled safety systems such as object recognition,
pedestrian and lane detection, automatic braking systems, etc. offer driving assistance to
reduce human errors and make vehicles safer for everyone.
Remote access to vehicle info. Users can easily get relevant information about their cars, such
as fuel level or location in a parking lot, remotely via a mobile app, which saves time and
enhances driver experience.
Improved car maintenance. The data collected from IoT sensors installed in vehicles can
be analyzed to detect pre-failure car conditions, prompting users to take preventive measures
in order to avoid malfunctions and reduce the cost of car maintenance.
Fleet management & telematics. IoT enriches fleet management with new advanced
functionality, making it more cost-efficient and reducing the need for manual operations. The
IoT devices integrated into vehicles collect real-time data about vehicle speed, location, load,
fuel consumption, driver behavior, etc. Gaining insights from this data with the help of IoT
analytics solutions, fleet management operators can automatically calculate optimal
routes, monitor the driving habits of their employees, screen vehicle performance, and
leverage predictive maintenance to avoid business disruptions. For instance, DHL launched its
DHL SmarTrucking solution that uses IoT sensors to gather real-time fleet data like location,
weather, traffic, etc. for more efficient fleet scheduling and route optimization.
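The analytics step in fleet management can be sketched with a toy example. The record format and vehicle IDs below are assumptions for illustration; real telematics feeds carry far richer data. The sketch aggregates raw per-trip telemetry into a per-vehicle fuel-efficiency figure, the kind of insight an operator would use to compare vehicles or drivers.

```python
# Sketch of turning raw fleet telemetry into an insight, assuming a
# simple list of (vehicle_id, km_driven, litres_used) trip records.
def fuel_efficiency(records):
    """Return km per litre for each vehicle across all its trips."""
    totals = {}
    for vid, km, litres in records:
        d, l = totals.get(vid, (0.0, 0.0))
        totals[vid] = (d + km, l + litres)
    return {vid: round(km / litres, 2) for vid, (km, litres) in totals.items()}

# Hypothetical telemetry: truck-1 has two trips, truck-2 has one.
telemetry = [
    ("truck-1", 120.0, 15.0),
    ("truck-2", 200.0, 40.0),
    ("truck-1", 80.0, 10.0),
]
print(fuel_efficiency(telemetry))  # truck-1: 8.0 km/l, truck-2: 5.0 km/l
```

An operator seeing truck-2 consistently underperform could schedule an inspection or review the driver's habits, which is the cost-saving loop the paragraph above describes.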
Connected cars. Integrated IoT sensors help connect cars and enable vehicle-to-vehicle (V2V)
interaction. Cars can share relevant information like location, route, speed, etc. This feature
helps prevent accidents and makes roads safer. In case of emergency, drivers of other cars in
close proximity may be notified to take preventive measures or slow down their car. V2V
connection also helps emergency vehicles navigate through traffic.
Predictive maintenance. IoT solutions constantly monitor vehicle condition to predict potential
issues. IoT sensors gather real-time data on fuel consumption, engine temperature, fluid levels,
run time, etc. This data is then analyzed to detect pre-failure conditions and alert the driver in
advance. This approach to maintenance, compared to traditional scheduled check-ups, can help
avoid unnecessary expenses and save time and effort while also helping prevent vehicle
breakdowns.
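A minimal pre-failure check of the kind described above can be sketched as a threshold rule over recent readings. Real predictive maintenance uses statistical or machine-learning models over many signals; here the 105 °C limit, the three-reading window and the sample temperatures are all illustrative assumptions.

```python
# Sketch of a pre-failure alert: flag the vehicle when the last few
# engine-temperature readings all exceed a limit. Threshold, window
# and readings are illustrative, not from a real vehicle.
def needs_maintenance(temps, limit=105.0, window=3):
    """Alert if the last `window` readings all exceed `limit` degrees C."""
    recent = temps[-window:]
    return len(recent) == window and all(t > limit for t in recent)

readings = [92.0, 101.0, 106.5, 108.2, 110.0]
print(needs_maintenance(readings))  # last three readings exceed 105 -> True
```

Requiring several consecutive high readings, rather than one, avoids alerting on a single noisy sensor value while still warning well before a breakdown.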
IoT-enabled in-vehicle infotainment systems provide services like music streaming, navigation,
voice assistance, hands-free calling, etc. Volvo developed its Sensus Connect infotainment
system with cloud-based services such as improved navigation with 3D maps, free map
updates, an option to send destination instructions remotely, etc. to help connect their cars to
the wider web and simplify car ownership.
Semi-autonomous vehicles can take partial control of driving, braking, parking, or lane
changing. IoT solutions and integrated in-vehicle cameras help smart vehicles calculate the
safest and most efficient course of action to provide driving assistance and reduce the
likelihood of road accidents.
IoT in Healthcare-
Before Internet of Things, patients’ interactions with doctors were limited to visits, and tele and
text communications. There was no way doctors or hospitals could monitor patients’ health
continuously and make recommendations accordingly.
Internet of Things (IoT)-enabled devices have made remote monitoring in the healthcare sector
possible, unleashing the potential to keep patients safe and healthy, and empowering
physicians to deliver superlative care. It has also increased patient engagement and satisfaction
as interactions with doctors have become easier and more efficient. Furthermore, remote
monitoring of patients' health helps in reducing the length of hospital stays and prevents re-
admissions. IoT also has a major impact on reducing healthcare costs significantly and
improving treatment outcomes.
IoT is undoubtedly transforming the healthcare industry by redefining the space of devices and
people interaction in delivering healthcare solutions. IoT has applications in healthcare that
benefit patients, families, physicians, hospitals and insurance companies.
IoT for Patients - Devices in the form of wearables like fitness bands and other wirelessly
connected devices like blood pressure and heart rate monitoring cuffs, glucometer etc. give
patients access to personalized attention. These devices can be tuned to give reminders for
calorie counts, exercise checks, appointments, blood pressure variations and much more.
IoT has changed people’s lives, especially elderly patients, by enabling constant tracking of
health conditions. This has a major impact on people living alone and their families. On any
disturbance or change in the routine activities of a person, the alert mechanism sends signals to
family members and concerned health providers.
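The alert mechanism for routine disturbances can be sketched as a comparison of today's activity against a historical baseline. The step counts and the "below half of average" rule are assumptions chosen for illustration; a deployed system would use richer signals and tuned thresholds.

```python
# Sketch of a routine-disturbance alert: notify family when today's
# activity falls far below the usual average. Step counts and the
# 50% drop rule are illustrative assumptions.
def routine_alert(baseline_steps, today_steps, drop_ratio=0.5):
    """Alert when today's activity is below `drop_ratio` of the baseline average."""
    avg = sum(baseline_steps) / len(baseline_steps)
    return today_steps < avg * drop_ratio

week = [4000, 4200, 3900, 4100, 4000]     # a week of normal activity
print(routine_alert(week, 1500))          # well below half the average -> True
print(routine_alert(week, 3800))          # within the normal range -> False
```

Comparing against the person's own baseline, rather than a fixed number, keeps the alert meaningful for people with very different activity levels.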
IoT for Physicians - By using wearables and other home monitoring equipment embedded with
IoT, physicians can keep track of patients’ health more effectively. They can track patients’
adherence to treatment plans or any need for immediate medical attention. IoT enables
healthcare professionals to be more watchful and connect with the patients proactively. Data
collected from IoT devices can help physicians identify the best treatment process for patients
and reach the expected outcomes.
IoT for Hospitals - Apart from monitoring patients’ health, there are many other areas where
IoT devices are very useful in hospitals. IoT devices tagged with sensors are used for tracking
real time location of medical equipment like wheelchairs, defibrillators, nebulizers, oxygen
pumps and other monitoring equipment. Deployment of medical staff at different locations can
also be analyzed real time.
The spread of infections is a major concern for patients in hospitals. IoT-enabled hygiene
monitoring devices help in preventing patients from getting infected. IoT devices also help in
asset management like pharmacy inventory control, and environmental monitoring, for
instance, checking refrigerator temperature, and humidity and temperature control.
IoT for Health Insurance Companies – There are numerous opportunities for health insurers
with IoT-connected intelligent devices. Insurance companies can leverage data captured
through health monitoring devices for their underwriting and claims operations. This data will
enable them to detect fraudulent claims and identify prospects for underwriting. IoT devices bring
transparency between insurers and customers in the underwriting, pricing, claims handling, and
risk assessment processes. With IoT-captured data driving decisions across all operational
processes, customers will have adequate visibility into the reasoning behind every decision
made and its outcome.
Insurers may offer incentives to their customers for using and sharing health data generated by
IoT devices. They can reward customers for using IoT devices to keep track of their routine
activities and adherence to treatment plans and precautionary health measures. This will help
insurers to reduce claims significantly. IoT devices can also enable insurance companies to
validate claims through the data captured by these devices.
Redefining Healthcare
The proliferation of healthcare-specific IoT products opens up immense opportunities. And the
huge amount of data generated by these connected devices holds the potential to transform
healthcare.
IoT has a four-step architecture, where each step is a stage in a process (see Figure 1). All four
stages are connected in such a way that data captured or processed at one stage yields value
to the next stage. Integrating the stages in this way brings insights and opens up dynamic
business prospects.
Step 1: The first step consists of the deployment of interconnected devices, including sensors,
actuators, monitors, detectors, camera systems etc. These devices collect the data.
Step 2: Data received from sensors and other devices is usually in analog form, and needs to
be aggregated and converted to digital form for further data processing.
Step 3: Once the data is digitized and aggregated, this is pre-processed, standardized and
moved to the data center or Cloud.
Step 4: Final data is managed and analyzed at the required level. Advanced Analytics, applied to
this data, brings actionable business insights for effective decision-making.
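The four steps above can be sketched as a tiny pipeline. The conversions are deliberately simplified placeholders (rounding stands in for analog-to-digital conversion, sorting for pre-processing, and a fever flag for analytics), and the temperature readings and 38.0 °C threshold are illustrative assumptions, not clinical guidance.

```python
# The four-step IoT architecture as a toy pipeline:
# collect -> digitize -> preprocess -> analyze.
def collect():
    # Step 1: interconnected devices produce raw analog readings
    # (here, hypothetical body-temperature samples in degrees C).
    return [36.61, 36.72, 38.94]

def digitize(analog):
    # Step 2: aggregate and convert to digital form (rounded here).
    return [round(v, 1) for v in analog]

def preprocess(digital):
    # Step 3: standardize the data before moving it to the cloud.
    return sorted(digital)

def analyze(data):
    # Step 4: derive an actionable insight for decision-making.
    return {"max_temp": data[-1], "fever": data[-1] >= 38.0}

print(analyze(preprocess(digitize(collect()))))
```

Each function consumes exactly what the previous one produced, mirroring how value flows from one stage of the architecture to the next.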
IoT is redefining healthcare by ensuring better care, improved treatment outcomes and
reduced costs for patients, and better processes and workflows, improved performance and
patient experience for healthcare providers.
The major advantages of IoT in healthcare include:
Cost Reduction: IoT enables patient monitoring in real time, thus significantly cutting
down unnecessary visits to doctors, hospital stays and re-admissions
Improved Treatment: It enables physicians to make evidence-based informed decisions
and brings absolute transparency
Faster Disease Diagnosis: Continuous patient monitoring and real time data helps in
diagnosing diseases at an early stage or even before the disease develops based on
symptoms
Proactive Treatment: Continuous health monitoring opens the doors for providing
proactive medical treatment
Drugs and Equipment Management: Management of drugs and medical equipment is a
major challenge in the healthcare industry. Through connected devices, these are
managed and utilized efficiently with reduced costs
Error Reduction: Data generated through IoT devices not only help in effective decision
making but also ensure smooth healthcare operations with reduced errors, waste and
system costs
Healthcare IoT is not without challenges. IoT-enabled connected devices capture huge amounts
of data, including sensitive information, giving rise to concerns about data security.
Implementing apt security measures is crucial. IoT explores new dimensions of patient care
through real-time health monitoring and access to patients’ health data. This data is a goldmine
for healthcare stakeholders to improve patient’s health and experiences while making revenue
opportunities and improving healthcare operations. Being prepared to harness this digital
power would prove to be the differentiator in the increasingly connected world.
Factors affecting IoT Healthcare Applications
There are various factors that affect IoT healthcare applications. Some of them are mentioned
below:
o Continuous Research: It requires continuous research in every field (smart devices, fast
communication channels, etc.) of healthcare to provide faster and better facilities for
patients.
o Smart Devices: Smart devices need to be used in the healthcare system. IoT opens up
the potential of current technology and leads us toward new and better medical device
solutions.
o Better Care: Using IoT technology, healthcare professionals get enormous amounts of
patient data, analyze the data and facilitate better care for the patient.
The application of the Internet of Things (IoT) in healthcare makes it smarter, faster
and more accurate. There are different IoT architectures in healthcare that bring about a
smart healthcare system.
Analytics: The healthcare system analyzes the data from sensors and correlates it to derive
the patient's health parameters; on the basis of the analyzed data, providers can act to
improve the patient's health.
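The correlation step can be illustrated with a small sketch that combines two assumed sensor inputs (heart rate and body temperature) into a coarse patient status. The thresholds are illustrative only and are not clinical guidance.

```python
# Sketch of the analytics step: correlate two vitals into a simple
# status flag. Thresholds are illustrative, not medical advice.
def patient_status(heart_rate, temp_c):
    """Combine heart rate (bpm) and temperature (C) into a coarse status."""
    if heart_rate > 120 or temp_c >= 39.0:
        return "alert"      # either vital is critical -> notify a provider
    if heart_rate > 100 or temp_c >= 38.0:
        return "monitor"    # elevated but not critical -> watch closely
    return "normal"

print(patient_status(72, 36.8))   # normal
print(patient_status(110, 38.2))  # monitor
print(patient_status(130, 37.0))  # alert
```

Even this crude rule shows why correlating sensors matters: a mildly elevated heart rate alone might be exercise, but combined with a fever it warrants closer monitoring.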