Fog Computing
1. INTRODUCTION
IoT environments generate unprecedented amounts of data that can be useful in many
ways, particularly if analyzed for insights. However, the data volume can overwhelm
today's storage systems and analytics applications. The Internet of Things (IoT) will be
the Internet of the future, as we have seen a huge increase in wearable technology, smart
grids, smart homes/cities, and smart connected vehicles. Fog computing usually
cooperates with cloud computing.
As a result, end users, fog, and cloud together form a three-layer service delivery
model. Fog computing also shows a strong connection to cloud computing in terms of
characterization. For example, elastic resources (computation, storage, and networking)
are the building blocks of both, indicating that most cloud computing technologies can
be directly applied to fog computing. However, fog computing has several unique
properties that distinguish it from other existing computing architectures. The most
important is its close distance to end users: it is vital to keep computing resources at
the edge of the network to support latency-sensitive applications and services. Another
interesting property is location awareness.
Cost Savings
The most significant cloud computing benefit is IT cost savings. Businesses, no matter
their type or size, exist to earn money while keeping capital and operational expenses
to a minimum. With cloud computing, you can save substantial capital costs, with zero
in-house server storage and application requirements. The lack of on-premises
infrastructure also removes the associated operational costs in the form of power, air
conditioning, and administration. You pay for what is used and disengage whenever you
like; there is no invested IT capital to worry about. It's a common misconception that
only large businesses can afford to use the cloud, when in fact cloud services are
extremely affordable for smaller businesses.
Reliability
With a managed service platform, cloud computing is much more reliable and
consistent than in-house IT infrastructure. Most providers offer a Service Level
Agreement that guarantees 24/7/365 support and 99.99% availability. Your organization
can benefit from a massive pool of redundant IT resources, as well as quick failover
mechanisms: if a server fails, hosted applications and services can easily be migrated
to any of the available servers.
Manageability
Strategic Edge
Downtime
As cloud service providers take care of a number of clients each day, they can become
overwhelmed and may even come up against technical outages. This can lead to your
business processes being temporarily suspended. Additionally, if your internet
connection is offline, you will not be able to access any of your applications, servers,
or data from the cloud.
Security
Although cloud service providers implement the best security standards and industry
certifications, storing data and important files with external service providers always
opens up risks. Using cloud-powered technologies means you need to provide your
service provider with access to important business data. Meanwhile, being a public
service opens cloud service providers up to security challenges on a routine basis. The
ease of procuring and accessing cloud services can also give nefarious users the ability
to scan, identify, and exploit loopholes and vulnerabilities within a system. For instance,
in a multi-tenant cloud architecture where multiple users are hosted on the same server,
a hacker might try to break into the data of other users hosted and stored on the same
server. In practice, however, such exploits surface rarely, and the likelihood of a
compromise is low.
Limited Control
Since the cloud infrastructure is entirely owned, managed, and monitored by the service
provider, it transfers minimal control to the customer. The customer can only control
and manage the applications, data, and services operated on top of it, not the backend
infrastructure itself. Key administrative tasks such as server shell access, updating,
and firmware management may not be passed to the customer or end user.
On November 19, 2015, Cisco Systems, ARM Holdings, Dell, Intel, Microsoft, and
Princeton University founded the OpenFog Consortium to promote interest and
development in fog computing. Cisco Sr. Managing Director Helder Antunes became
the consortium's first chairman, and Intel's Chief IoT Strategist Jeff Fedders became its
first president.
The term “Fog Computing” was introduced by Cisco Systems as a new model to ease
wireless data transfer to distributed devices in the Internet of Things (IoT) network
paradigm. Cisco defines Fog Computing as a paradigm that extends Cloud computing
and services to the edge of the network. Similar to the Cloud, the Fog provides data,
compute, storage, and application services to end users. The distinguishing Fog
characteristics are its proximity to end users, its dense geographical distribution, and
its support for mobility. Services are hosted at the network edge or even on end devices
such as set-top boxes or access points. By doing so, the Fog reduces service latency
and improves QoS, resulting in a superior user experience. Fog Computing supports
emerging Internet of Everything (IoE) applications that demand real-time, predictable
latency (industrial automation, transportation, networks of sensors and actuators).
Thanks to its wide geographical distribution, the Fog paradigm is well positioned for
real-time big data and real-time analytics. Fog supports densely distributed data
collection points, hence adding a fourth axis to the often-mentioned Big Data
dimensions [4].
Fog Computing enables a new breed of applications and services, and there is a
fruitful interplay between the Cloud and the Fog, particularly when it comes to data
management and analytics. The Fog vision was conceived to address applications and
services that do not fit well within the paradigm of the Cloud [6]. They include:
• Applications that require very low and predictable latency. The Cloud frees the user
from many implementation details, including precise knowledge of where the
computation or storage takes place. This freedom from choice, welcome in many
circumstances, becomes a liability when latency is at a premium (gaming, video
conferencing).
• Large-scale distributed control systems (smart grid, connected rail, smart traffic light
systems).
2. SYSTEM DESIGN
The previous chapter introduced fog computing; this chapter describes the role of fog
computing in the IoT (Internet of Things), its design goals, and the system design and
components of fog computing.
2.1. Role of fog computing in IoT:
2. Wireless Sensor and Actuator Networks: Real Wireless Sensor Networks (WSNs)
were designed to operate at particularly low power in order to extend battery life, or
even to make energy harvesting feasible. Most of these WSNs involve a large number
of low-bandwidth, low-energy, very-low-processing-power, small-memory motes,
operating as sources for a sink (collector) in a unidirectional fashion [3].
3. IoT and Cyber-Physical Systems (CPSs): Fogging-based systems are becoming a
significant class of IoT and CPSs. IoT is a network that can interrelate ordinary physical
objects with identified addresses. CPSs feature a tight combination of a system's
computational and physical elements. CPSs also organize the integration of computer-
and data-centric physical engineered systems [3].
4. Software-Defined Networks (SDN): The SDN concept, along with fogging, will
address the main problems in vehicular networks (intermittent connectivity, collisions,
and high packet loss) by supplementing vehicle-to-vehicle with vehicle-to-infrastructure
communication and unified control [3].
There are several design goals for an adequate fog computing platform.
1. Latency.
It is fundamental for a fog computing platform to offer end users low-latency-
guaranteed applications and services. The latency comes from the execution
time of a task, the task offloading time, the time for cyber foraging, the speed
of decision making, etc. [3].
2. Efficiency.
While at first glance efficiency may have its own impact on latency, it is
more related to the efficient utilization of resources and energy [3]. The reasons
are obvious and quite different from their counterparts in cloud computing scenarios:
• Not all fog nodes are resource-rich; some of them have limited
computation power, memory, and storage.
• Most fog nodes and clients are battery-powered, such as handheld
devices, wearables, and wireless sensor units.
3. Generality.
Due to the heterogeneity of fog nodes and clients, we need to provide the same
abstraction to top-layer applications and services for fog clients. General
application programming interfaces (APIs) should be provided to cope with
existing protocols and APIs (e.g., machine-to-machine protocols, smart
vehicle/smart appliance APIs, etc.) [3].
1. Heterogeneity:
The origins of the Fog can be traced to early proposals to support endpoints
with rich services at the edge of the network, including applications with low
latency requirements (e.g. gaming, video streaming, and augmented reality) [5].
3. Geographical distribution:
In sharp contrast to the more centralized Cloud, the services and applications
targeted by the Fog demand widely distributed deployments. The Fog, for
instance, will play an active role in delivering high-quality streaming to moving
vehicles, through proxies and access points positioned along highways and
tracks [5].
Environmental monitoring and the Smart Grid are other examples of
inherently distributed systems, requiring distributed computing and storage
resources [5].
7. Real-time interactions:
• The Fog platform should facilitate seamless resource management across the
diverse set of platforms.
• The Fog platform hosts a diverse set of applications belonging to various
verticals: smart connected vehicles, smart cities, oil and gas, smart grid, etc.
The Fog architecture should expose generic APIs that this diverse set of
applications can use to leverage the Fog platform.
• The Fog platform should provide the necessary means for distributed
policy-based orchestration, resulting in scalable management of individual
subsystems and the overall service [4].
Fog nodes are heterogeneous in nature. They range from high-end servers, edge routers,
access points, and set-top boxes to end devices such as vehicles, sensors, and mobile
phones. The different hardware platforms have varying levels of RAM, secondary
storage, and real estate to support new functionalities. The platforms run various kinds
of OSes and software applications, resulting in a wide variety of hardware and software
capabilities.
The Fog network infrastructure is also heterogeneous in nature, ranging from high-
speed links connecting enterprise data centers and the core to multiple wireless access
technologies (e.g., 3G/4G, LTE, WiFi) towards the edge [4].
Fig 2.1: Components in Fog Architecture.
The Fog abstraction layer hides the platform heterogeneity and exposes a uniform and
programmable interface for seamless resource management and control.
The layer provides generic APIs for monitoring, provisioning and controlling physical
resources such as CPU, memory, network and energy. The layer also exposes generic
APIs to monitor and manage various hypervisors, OSes, service containers, and service
instances on a physical machine (discussed more later) [4].
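As an illustration, such a uniform resource-management interface might look like the following Python sketch; the class and method names and the resource model are assumptions for illustration, not the actual APIs from [4].

```python
from abc import ABC, abstractmethod

class FogAbstractionLayer(ABC):
    """Hypothetical sketch of the uniform interface the abstraction
    layer might expose; names are illustrative, not from [4]."""

    @abstractmethod
    def monitor(self, resource: str) -> dict:
        """Return current usage for a physical resource such as 'cpu' or 'memory'."""

    @abstractmethod
    def provision(self, resource: str, amount: float) -> bool:
        """Reserve an amount of a physical resource; False if it would oversubscribe."""

class LocalNode(FogAbstractionLayer):
    def __init__(self):
        self._capacity = {"cpu": 4.0, "memory": 8.0}
        self._used = {"cpu": 0.0, "memory": 0.0}

    def monitor(self, resource):
        return {"capacity": self._capacity[resource], "used": self._used[resource]}

    def provision(self, resource, amount):
        if self._used[resource] + amount > self._capacity[resource]:
            return False  # reject: would oversubscribe the physical machine
        self._used[resource] += amount
        return True

node = LocalNode()
print(node.provision("cpu", 2.0))   # True
print(node.provision("cpu", 3.0))   # False: only 2.0 of 4.0 cores left
print(node.monitor("cpu"))
```

The same interface could be implemented over a hypervisor or a service container rather than a bare machine, which is the point of the abstraction.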
Virtualization helps improve resource utilization and enables the abstraction layer to
support multi-tenancy. The layer exposes generic APIs to specify security, privacy, and
isolation policies:
• Data and resource isolation guarantees for the different tenants on the same
physical infrastructure [4].
• The capability to inflict no collateral damage on the different parties, at a
minimum [4].
• A single, consistent model across physical machines to provide these
isolation services [4].
The abstraction layer exposes both the physical and the logical (per-tenant)
network to administrators, as well as the resource usage per tenant [4].
The orchestration layer requires:
• A software agent with a reasonably small footprint, yet capable of bearing the
orchestration functionality and performance requirements, that can be
embedded in various edge devices.
• A distributed, persistent storage for policies and resource metadata
(capability, performance, etc.) that supports high-transaction-rate update and
retrieval.
• A scalable messaging bus to carry control messages for service orchestration
and resource management.
• A distributed policy engine with a single global view and local enforcement [4].
The distributed Fog orchestration framework consists of several Foglet software agents,
one running on every node in the Fog platform. The Foglet agent uses abstraction-layer
APIs to monitor the health and state associated with the physical machine and the
services deployed on it. This information is both analyzed locally and pushed to the
distributed storage for global processing [4].
The Fog software framework exposes northbound APIs that applications use to
effectively leverage the Fog platform. These APIs are broadly classified into data and
control APIs. Data APIs allow an application to leverage the Fog distributed data store.
Control APIs allow an application to specify how it should be deployed on the Fog
platform [4].
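A minimal sketch of a Foglet agent's monitor-analyze-push loop; the class name comes from the text, but the method names, metric format, and the plain dict standing in for the distributed store are assumptions for illustration.

```python
import time

class Foglet:
    """Illustrative Foglet agent: samples local health via an
    abstraction-layer callable and pushes snapshots to a store."""

    def __init__(self, node_id, sample_metrics, store):
        self.node_id = node_id
        self.sample_metrics = sample_metrics  # callable returning local state
        self.store = store                    # stand-in for the distributed store

    def tick(self):
        snapshot = {"ts": time.time(), **self.sample_metrics()}
        local_alert = snapshot.get("cpu_load", 0.0) > 0.9  # analyzed locally...
        self.store[self.node_id] = snapshot                 # ...and pushed for global view
        return local_alert

store = {}
agent = Foglet("edge-1", lambda: {"cpu_load": 0.95, "services": 3}, store)
print(agent.tick())        # True: local analysis flags overload
print("edge-1" in store)   # True: state also pushed to the global store
```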
Fig 2.2: Policy-based orchestration framework
IoT will create enormous amounts of data, driving a need for distributed
intelligent data management and so-called 'fast' Big Data processing [4].
The above figure illustrates the notion of some data being pre-processed and potentially
used in real-time whereas other data is stored or even archived for much later use in a
more centralized cloud infrastructure or platform environment [4].
Every IoT communication deployment is unique. However, there are four basic stages
common to just about every IoT application: data collection, data transmission, data
assessment, and response to the available information. Successful data management is
therefore very important to the success of IoT [4].
Data management for IoT can be viewed as a two-part system: an online/real-time
front-end (e.g., distributed nodes) and an offline back-end (centralized Cloud storage).
The online/real-time portion of the system is concerned with data management
associated with distributed objects/assets/devices and their associated sensors. As
discussed later in this report, there are issues pertaining to the need for "fast data" and
the distributed intelligence to deal with it.
The front-end also passes data (in the form of proactive pushes and responses to
queries) from the objects/devices/sensors to the back-end. The frequent communication
between front-end and back-end is what makes this portion "online." The back-end is
storage-intensive: it stores select data produced from disparate sources and supports
in-depth queries and analysis over the long term, as well as data archival needs [4].
There will also be a need for advanced Data Virtualization techniques for IoT Data.
Data virtualization is any approach to data management that allows an application to
retrieve and manipulate data without requiring technical details about the data, such as
how it is formatted or where it is physically located. An example of a leading company
in this area is Cisco, whose Data Virtualization offering represents an agile data
integration software solution that makes it easy to abstract and view data, regardless of
where it resides. With their integrated data platform, a business can query various types
of data across the network as if it were in a single place [4].
There are also database and infrastructure issues to consider for IoT data management.
3. WORKING OF FOG COMPUTING
The previous chapter described the system design and components of fog computing;
this chapter describes the overall working of fog computing for data processing, data
storage, data transmission, and data compute.
Fig 3.1: Distributed data processing in a fog-computing environment.
Fog computing supports user mobility, resource and interface heterogeneity, and
distributed data analytics to address the requirements of widely distributed applications
that need low latency [1].
Developers either port or write IoT applications for fog nodes at the network edge. The
fog nodes closest to the network edge ingest the data from IoT devices [2]. Then (and
this is crucial) the fog IoT application directs different types of data to the optimal
place for analysis:
• The most time-sensitive data is analyzed on the fog node closest to the things
generating the data. In a Cisco Smart Grid distribution network, for example,
the most time-sensitive requirement is to verify that protection and control loops
are operating properly. Therefore, the fog nodes closest to the grid sensors can
look for signs of problems and then prevent them by sending control commands
to actuators [2].
• Data that can wait seconds or minutes for action is passed along to an
aggregation node for analysis and action. In the Smart Grid example, each
substation might have its own aggregation node that reports the operational
status of each downstream feeder and lateral [2].
• Data that is less time-sensitive is sent to the cloud for historical analysis, big
data analytics, and long-term storage. For example, each of thousands or
hundreds of thousands of fog nodes might send periodic summaries of grid data
to the cloud for historical analysis and storage.
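The three-way routing above can be sketched as a simple dispatch rule; the latency thresholds (one second, five minutes) are illustrative assumptions, not values from [2].

```python
# Route a reading by how long it can wait before action is needed:
# milliseconds -> nearest fog node; seconds/minutes -> aggregation node;
# anything slower -> cloud, for historical analysis and long-term storage.

def route(reading, max_wait_s):
    if max_wait_s < 1:      # milliseconds matter: act at the edge
        return "edge-fog-node"
    if max_wait_s < 300:    # seconds to minutes: aggregate per substation
        return "aggregation-node"
    return "cloud"          # big data analytics, long-term storage

print(route({"sensor": "protection-loop"}, max_wait_s=0.05))  # edge-fog-node
print(route({"sensor": "feeder-status"}, max_wait_s=30))      # aggregation-node
print(route({"sensor": "grid-summary"}, max_wait_s=86400))    # cloud
```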
Fog Networking consists of a control plane and a data plane, where most of the
processing takes place in the data plane of a smart mobile or on the edge of the network
in a gateway device [2].
While edge devices and sensors are where data is generated and collected, they don't
have the compute and storage resources to perform advanced analytics and
machine-learning tasks [2].
Though cloud servers have the power to do so, they are often too far away to process
the data and respond in a timely manner [2].
In addition, having all endpoints connect to and send raw data to the cloud over the
internet can have privacy, security, and legal implications, especially when dealing
with sensitive data subject to regulations in different countries.
In a fog environment, the processing takes place in a data hub on a smart device, or in
a smart router or gateway, thus reducing the amount of data sent to the cloud. It is
important to note that fog networking complements, not replaces, cloud computing:
fogging allows for short-term analytics at the edge, while the cloud performs
resource-intensive, longer-term analytics.
Fog computing can be perceived both in large cloud systems and in big data structures,
making reference to the growing difficulty of accessing information objectively, which
results in a lack of quality in the obtained content. The effects of fog computing on
cloud computing and big data systems may vary; yet a common aspect that can be
extracted is a limitation in accurate content distribution, an issue that has been tackled
with the creation of metrics that attempt to improve accuracy [2].
Fog networking consists of a control plane and a data plane. For example, on the data
plane, fog computing enables computing services to reside at the edge of the network
as opposed to servers in a data center. Compared to cloud computing, fog computing
emphasizes proximity to end users and client objectives, dense geographical
distribution and local resource pooling, latency reduction for quality of service, and
edge analytics/stream mining, resulting in a superior user experience and redundancy
in case of failure [2].
Fog nodes:
• receive feeds from IoT devices, using any protocol, in real time;
• run IoT-enabled applications for real-time control and analytics, with
millisecond response time;
• provide transient storage, often for 1-2 hours; and
• send periodic data summaries to the cloud.
The cloud platform, in turn:
• receives and aggregates data summaries from many fog nodes;
• performs analysis on the IoT data and data from other sources to gain
business insight; and
• can send new application rules to the fog nodes based on these insights.
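The fog-node side of this workflow (transient storage plus periodic summaries) might be sketched as follows; the TTL handling and the summary shape are assumptions for illustration.

```python
import time
from collections import deque

class TransientStore:
    """Sketch of a fog node's short-lived buffer: readings expire after
    ttl_s seconds, and compact summaries, not raw data, go to the cloud."""

    def __init__(self, ttl_s=2 * 3600):   # ~1-2 hours of transient storage
        self.ttl_s = ttl_s
        self.buf = deque()                # (timestamp, value) pairs

    def ingest(self, value, now=None):
        now = time.time() if now is None else now
        self.buf.append((now, value))
        while self.buf and now - self.buf[0][0] > self.ttl_s:
            self.buf.popleft()            # evict readings older than the TTL

    def summary(self):
        values = [v for _, v in self.buf]
        return {"count": len(values),
                "mean": sum(values) / len(values) if values else None}

store = TransientStore(ttl_s=7200)
store.ingest(10.0, now=0)
store.ingest(20.0, now=3600)
store.ingest(30.0, now=8000)   # first reading (t=0) is now past the 2-hour TTL
print(store.summary())         # {'count': 2, 'mean': 25.0}
```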
3.3. How Fog Computing Will Help to Control Traffic
These systems will communicate with each other, say, every 15 minutes. The DM
(decision maker) or local server will communicate with the other local servers every
10 minutes.
communicate with the other systems with the help of the communicator, and this is
how the other systems will get information about heavy traffic in that area.
• The sensors will detect the number of vehicles at the zebra crossing.
• If the number of vehicles is high, the system will not allow the pedestrians to
cross and will keep the signal green for vehicles.
• If the number of vehicles is low, it will give the red signal to the vehicles and
then allow the pedestrians to cross the road.
• If the decision makers were on the cloud, far away from the system location,
taking the decision would have incurred a long delay.
As mentioned earlier, the benefits of Fog Computing help this Smart Traffic
Light system work efficiently in real time [2].
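The local decision rule for the smart traffic light can be sketched in a few lines; the vehicle-count threshold is an invented example value.

```python
# Decide locally, at the intersection's fog node, whether to stop traffic
# (red for vehicles) so pedestrians may cross; no cloud round trip needed.

def signal_for_pedestrians(vehicle_count, threshold=5):
    if vehicle_count > threshold:
        return "green"   # heavy traffic: keep vehicles moving, pedestrians wait
    return "red"         # light traffic: stop vehicles, let pedestrians cross

print(signal_for_pedestrians(12))  # green
print(signal_for_pedestrians(2))   # red
```

Because this runs on the local node, the decision latency is bounded by the sensor read, not by a wide-area network round trip.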
Multi-agency orchestration: the agencies that run the system must coordinate control
law policies in real time.
However, constructing a real IoT environment as a testbed for evaluating such
techniques is costly and doesn't provide a controllable environment for conducting
repeatable experiments. To overcome this limitation, an open-source simulator called
iFogSim was developed. iFogSim enables the modelling and simulation of fog-
computing environments for the evaluation of resource-management and scheduling
policies across edge and cloud resources under multiple scenarios, based on their
impact on latency, energy consumption, network congestion, and operational costs. It
measures performance metrics and simulates edge devices, cloud data centres, and
sensors [6].
4. ADVANTAGES AND APPLICATIONS
The previous chapters covered the fog architecture and the working of fog computing.
This chapter discusses the advantages and applications of fog computing, as well as
the differences between fog computing and cloud computing.
QoS is an important metric for fog services and can be divided into four aspects:
1) connectivity, 2) reliability, 3) capacity, and 4) delay.
Connectivity:
In a heterogeneous fog network, network relaying, partitioning, and clustering
provide new opportunities for reducing cost, trimming data, and expanding
connectivity. For example, an ad-hoc wireless sensor network can be partitioned
into several clusters based on the coverage of resource-rich fog nodes (cloudlet,
sink node, powerful smartphone, etc.). Similarly, the selection of a fog node by
an end user will heavily impact performance. We can dynamically select a
subset of fog nodes as relay nodes, with the optimization goal of maximal
availability of fog services for a certain area or a single user, under constraints
such as delay, throughput, connectivity, and energy consumption [7].
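One way to sketch such relay selection is a greedy coverage heuristic under a delay budget; the node data and the coverage model below are invented for illustration, and the formulations suggested by [7] would also weigh throughput and energy.

```python
# Greedily pick fog nodes that add the most new coverage until the target
# area is served, considering only nodes within the delay budget.

def select_relays(nodes, area, max_delay_ms):
    covered, relays = set(), []
    candidates = [n for n in nodes if n["delay_ms"] <= max_delay_ms]
    while covered != area and candidates:
        best = max(candidates, key=lambda n: len(n["covers"] - covered))
        if not (best["covers"] - covered):
            break  # no remaining candidate extends coverage
        relays.append(best["id"])
        covered |= best["covers"]
        candidates.remove(best)
    return relays, covered

nodes = [
    {"id": "cloudlet", "covers": {1, 2, 3}, "delay_ms": 5},
    {"id": "sink",     "covers": {3, 4},    "delay_ms": 8},
    {"id": "phone",    "covers": {5},       "delay_ms": 40},  # over budget
]
relays, covered = select_relays(nodes, area={1, 2, 3, 4}, max_delay_ms=20)
print(relays)   # ['cloudlet', 'sink']
```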
Reliability:
Normally, reliability can be improved through periodic checkpointing to resume
after failure, rescheduling of failed tasks, or replication to exploit parallel
execution. But checkpointing and rescheduling may not suit the highly dynamic
fog computing environment, since they introduce latency and cannot adapt to
changes. Replication seems more promising, but it relies on multiple fog nodes
working together [7].
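Replication-based execution can be sketched as running the same task on several fog nodes and taking the first successful result; the node behaviours below are simulated stand-ins, not a real fog API.

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def run_replicated(task, nodes, timeout_s=1.0):
    """Submit the same task to every node; return the first success."""
    with ThreadPoolExecutor(max_workers=len(nodes)) as pool:
        futures = [pool.submit(task, n) for n in nodes]
        for fut in as_completed(futures, timeout=timeout_s):
            try:
                return fut.result()   # first replica to succeed wins
            except Exception:
                continue              # a failed replica is simply ignored
    raise RuntimeError("all replicas failed")

def task(node):
    if node == "flaky-node":
        raise IOError("node unreachable")   # simulated failure
    return f"result from {node}"

print(run_replicated(task, ["flaky-node", "edge-1", "edge-2"]))
```

The trade-off is extra resource use on every replica in exchange for masking individual node failures without checkpoint/restart latency.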
Capacity:
Capacity has two aspects: 1) network bandwidth and 2) storage capacity. In order
to achieve high bandwidth and efficient storage utilization, it is important to
investigate how data are placed in the fog network, since data locality for
computation is very important. There are similar works in the context of the
cloud and of sensor networks. However, this problem faces new challenges in
fog computing. For example, a fog node may need to compute on data that is
distributed across several nearby nodes. Data placement in a federation of fog
and cloud also needs critical thinking: the challenge is how to design the
interplay between fog and cloud to accommodate different workloads. Due to
the dynamic data placement and large overall capacity volume in fog
computing, we may also need to redesign search engines so that they can
process search queries over content scattered across fog nodes [7].
Delay:
Latency-sensitive applications, such as stream mining or complex event
processing, are typical applications that need fog computing to provide
real-time stream processing rather than batch processing. A fog-based
opportunistic spatiotemporal event processing system has been proposed to
meet this latency requirement [7].
From Table 4.1, it can be seen that cloud computing characteristics have very severe
limitations with respect to the quality of service demanded by real-time applications
requiring almost immediate action by the server.
For example, support for mobility is limited in cloud computing but supported in fog
computing.
2. Eliminates the core computing environment, thereby reducing a major bottleneck
and a point of failure [3].
3. Improves security, as data is encoded as it moves towards the network edge [3].
Smart Grids
The smart grid is another application where fog computing is being used. Based
on energy demand, availability, and cost, smart devices can switch to alternative
energy sources such as solar and wind. The edge processes the data collected
by fog collectors and generates control commands for the actuators. The
filtered data are consumed locally, and the balance is sent to the higher tiers for
visualization, real-time reporting, and transactional analytics. The Fog supports
semi-permanent storage at the highest tier and momentary storage at the lowest
tier [2].
Fog computing can be used with smart utility services, whose focus is improving
energy generation, delivery, and billing. In such environments, edge devices can
report more fine-grained energy-consumption details (for example, hourly or
daily, rather than monthly, readings) to users' mobile devices than traditional
smart utility services. These edge devices can also calculate the cost of power
consumption throughout the day and suggest which energy source is most
economical at any given time, or when home appliances should be turned on to
minimize utility use [2].
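The cost suggestion described above reduces to a small calculation on the edge device; the tariffs and consumption figures below are invented example numbers, not real utility data.

```python
def cheapest_source(consumption_kwh, tariffs):
    """tariffs: {source: price per kWh}; return (best source, its cost)."""
    costs = {src: consumption_kwh * price for src, price in tariffs.items()}
    best = min(costs, key=costs.get)
    return best, costs[best]

# Hypothetical tariffs for the current hour, per kWh:
hourly_tariffs = {"grid": 0.30, "solar": 0.05, "wind": 0.12}
best, cost = cheapest_source(3.5, hourly_tariffs)
print(best)   # solar
```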
Connected car
The autonomous vehicle is the new trend on the road. Tesla is working on
software to add automatic steering, enabling literal "hands-free" operation of
the vehicle, starting out by testing and releasing self-parking features that don't
require a person behind the wheel. Soon, all new cars on the road will have the
capability to connect to nearby cars and to the internet. Fog computing will be
the best option for all internet-connected vehicles because fog computing
provides real-time interaction. Cars, access points, and traffic lights will be able
to interact with each other, making the road safer for all. At some point, the
connected car will start saving lives by reducing automobile accidents [2].
Big data processing is a hot topic for big data architectures in the cloud and
mobile cloud. Fog computing can provide elastic resources to large-scale data
processing systems without suffering from the main drawback of the cloud:
high latency. In the cloud computing paradigm, an event or data item is
transmitted to a data center inside the core network, and the result is sent back
to the end user after a series of processing steps. A federation of fog and cloud
can handle big data acquisition, aggregation, and pre-processing, reducing data
transportation and storage and balancing the computational power applied to
data processing. For example, in a large-scale environment monitoring system,
local and regional data can be aggregated and mined at fog nodes, providing
timely feedback, especially for emergencies such as a toxic pollution alert,
while detailed and thorough analyses, as computation-intensive tasks, can be
scheduled on the cloud side. We believe data processing in the fog will be the
key technique for tackling analytics on the large scale of data generated by
IoT applications [2].
Big Data has emerged in earnest over the past couple of years, and with that
emergence the Cloud became the architecture of choice. Even organizations far
from the most well-financed find it feasible to access massive quantities of Big
Data via the virtual resources of the Cloud, with its nearly infinite scalability
and on-demand pay structure [2].
Just as the cloud has created new business models, growth, and industries, fog can
eventually do the same, with new vendors, new industries, and new business models
emerging as industry works together with academia to address the challenges and
solve real business problems with these new architectural approaches.
Fog computing will provide ample opportunities for creating new applications and
services that cannot be easily supported by current host-based and cloud-based
application platforms. For example, new fog-based security services will be able to
help address many of the challenges we face in securing the Internet of Things.
Fog computing was introduced primarily to improve efficiency and to trim the amount
of data that needs to be transmitted for processing, analysis, and storage.
Developing these services at the edge through fog computing will lead to new business
models and opportunities for network operators.
5. CONCLUSION
Fog computing performs better than cloud computing for latency-sensitive workloads
by processing data closer to where it is produced and needed; it also protects sensitive
IoT data. Fog computing will grow in importance, helping the emerging network
paradigms that require faster processing with less delay. By applying the concepts of
fog computing, if the same device can be used for this kind of processing, the data
generated can be put to immediate use, delivering a much better user experience.
REFERENCES
[2] R. Buyya and A. V. Dastjerdi, Internet of Things, 1st ed.
[4] F. Bonomi, "Connected vehicles, the internet of things, and fog computing," in
The Eighth ACM International Workshop on Vehicular Inter-Networking (VANET),
Las Vegas, USA, 2011.
### Introduction to Fog Computing
Fog computing is an extension of cloud computing that brings computation, storage,
and networking services closer to the end-user.
It is designed to reduce latency, improve efficiency, and enhance security by processing
data at the edge of the network instead of relying solely on centralized cloud servers.
| **Feature** | **Fog Computing** | **Cloud Computing** |
|--------------|--------------|----------------|
| **Latency** | Low | High |
| **Data Processing** | Near the source | Centralized in the cloud |
| **Security** | Higher | Moderate |
| **Bandwidth Usage** | Optimized | High |
| **Reliability** | High | Dependent on network connectivity |
Page
Page
### Applications of Fog Computing
- **Smart Cities**: Fog computing helps in managing traffic signals, public safety, and
environmental monitoring efficiently.
- **Healthcare**: Enables real-time patient monitoring and faster diagnosis by
processing medical data locally.
- **Industrial IoT (IIoT)**: Factories use fog computing for predictive maintenance,
automation, and real-time analytics.
- **Autonomous Vehicles**: Self-driving cars require real-time processing to make
split-second decisions, which fog computing facilitates.
- **Agriculture**: Smart farming solutions use fog computing to analyze weather
patterns, soil conditions, and crop health.
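The IIoT case above can be made concrete with a short sketch: a fog node checks a batch of machine vibration readings against a limit and flags problems locally, without a cloud round trip. The threshold value and function name are assumptions for illustration, not part of any real IIoT standard.

```python
# Illustrative predictive-maintenance check at a fog node (not a real IIoT
# API): flag vibration readings that exceed a limit, locally and immediately.

VIBRATION_LIMIT = 7.0  # assumed alert threshold, in mm/s

def check_readings(readings):
    """Return the indices of readings that exceed VIBRATION_LIMIT."""
    return [i for i, r in enumerate(readings) if r > VIBRATION_LIMIT]

alerts = check_readings([2.1, 3.4, 9.8, 4.0, 7.5])
# readings at indices 2 and 4 exceed the limit and trigger local alerts
```

Because the check runs on the fog node, an alert can stop a machine in milliseconds; only the alert summary needs to travel to the cloud.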
### Key Features of Fog Computing
1. **Decentralized Computing**: Unlike cloud computing, fog computing distributes
data processing tasks across multiple nodes close to the data source.
2. **Reduced Latency**: Since data processing happens near the source, response
times are significantly improved.
3. **Enhanced Security**: Data is processed locally before being sent to the cloud,
reducing exposure to cyber threats.
4. **Scalability**: Fog computing supports the growing number of IoT devices and
applications by offloading cloud workloads.
5. **Bandwidth Optimization**: By processing data locally, it reduces the amount of
data that needs to be sent over the internet, lowering bandwidth costs.
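Bandwidth optimization (feature 5) is easy to illustrate: instead of uploading every raw sensor reading, a fog node can send only a compact summary upstream. The summary format below is an assumption for illustration, not a standard protocol.

```python
# Sketch of local aggregation at a fog node: reduce a batch of raw readings
# to a fixed-size summary before sending anything to the cloud.
# The summary fields are illustrative, not a standard message format.

def summarize(readings):
    """Reduce a batch of readings to a small summary record."""
    return {
        "count": len(readings),
        "min": min(readings),
        "max": max(readings),
        "mean": sum(readings) / len(readings),
    }

raw = [21.0, 21.5, 22.0, 21.5]  # readings captured at the edge
summary = summarize(raw)         # one small record sent upstream
# the upload shrinks from len(raw) values to a fixed-size summary
```

The payload sent over the internet stays the same size no matter how many readings the node collects locally, which is exactly the bandwidth saving the list describes.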
### Future of Fog Computing
As the Internet of Things (IoT) continues to expand, fog computing will play an
essential role in managing vast amounts of data efficiently.
With advancements in artificial intelligence (AI) and edge computing, fog computing
will further evolve, making smart applications more effective and reliable.